Scalable Estimation - Experiment Basics

Introduction

This workbook is an introduction to the basic concepts and designs relating to the paper

Fast estimation of sparse quantum noise by Harper, Yu and Flammia

This workbook goes through the basic ideas behind experimental design for a small 6 qubit system.

Initial things to note about the example

The protocol is of marginal utility on only a 6 qubit system! There are only 4096 Paulis to measure in a 6 qubit system. The protocol requires a minimum of 4n+2 experiments, each measuring $2^6 = 64$ possible outcomes: $26 \times 64 = 1664$ measurements - we could reconstruct all the Paulis for not much more than this! For this type of protocol we are assuming a $\delta$ of about $0.25$, i.e. $4^{0.25\times 6}\approx 8$, so we expect to recover only the 8-9 highest weight Paulis - obviously it is as system size increases that the algorithm shines. With the 14 qubit example in a later workbook, for instance, it makes a lot more sense.
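The arithmetic behind those numbers can be checked in a couple of lines (the variable names here are purely illustrative):

```julia
# Back-of-envelope counts for the 6 qubit example (illustrative names only).
n = 6                                      # number of qubits
nexperiments = 4n + 2                      # minimum number of experiments: 26
noutcomes = 2^n                            # each experiment has 2^6 = 64 outcomes
nmeasurements = nexperiments * noutcomes   # 26 * 64 = 1664 measurement settings
δ = 0.25
nrecoverable = 4^(δ * n)                   # 4^1.5 = 8 highest-weight Paulis, roughly
```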


Warning: As mentioned above this is a long and tedious workbook that is, ultimately, in some sense disappointing. Because we simulate the entire experiment we can (and do) look at all the intermediate results. Six qubits takes about a minute to simulate a run, and we need to do 4*6+2 = 26 of them. At the end we get 8 (well, often closer to 12) numbers that are *approximately* correct. I'll try and include a qiskit simulation, which will have a less realistic noise model but more qubits. Then, of course, there is the workbook used for the paper, which is based off actual 14 qubit experimental data and recovers a lot more information (because it is a bigger system). But if you want the whole gory detail: the system here is small enough to grasp everything that is happening, but big enough to be not entirely trivial - although it is a lot of work for very little (because of the small size of the system).


Assumed knowledge

The workbook assumes some familiarity with the basic decoder, which is detailed in the workbook Scalable Estimation - Basic Concepts. Another starter workbook that might be useful is Hadamard Basics and Observations - it also goes through the Hadamard transform (so there is some duplication), but then shows how it fits in with the SuperOperator representation, the impact of only being able to measure commuting Paulis (in the one experiment), and how that relates to the circuits we design and the measurements we make. We will be using these concepts in the circuits we simulate here, and I won't go through them again.

Because we are simulating the experiments here it is a bit painful - but I wanted an exact noise model so we could check we were in fact recovering what we expected.

The other workbook released with the code will show how the algorithm performs in practice with data derived from an IBM Quantum Experience 14 qubit machine.

This is the place to start if you want to understand how the local stabiliser circuits are formed and the various decoding bins collated.

I have left volumes and volumes of data in the cells. If you are trying to debug your own decoder - that will be helpful.

Software needed

For this introductory notebook, we need minimal software. All these packages should be available through the Julia package manager, although we will need a few extras to help with the simulation.

If you get an error trying to "use" them the error message tells you how to load them.

In [1]:
using Hadamard
using PyPlot
# convenience (type \otimes<tab>) - <tab> is the "tab" key.
⊗ = kron
# type \oplus<tab>
⊕ = (x,y)->mod.(x+y,2)
Out[1]:
#3 (generic function with 1 method)
In [2]:
# I love this add on, especially as some of the simulations take a noticeable time. 
using ProgressMeter
In [3]:
# We are going to need some more of my code
# You can get it by doing the following, in the main julia repl (hit ']' to get to the package manager)
# pkg> add https://github.com/rharper2/Juqst.jl

# Currently there is a harmless warning re overloading Hadamard.


using Juqst
In [4]:
# This is the code in this github that implements the various peeling algorithms for us.
include("peel.jl")
Out[4]:
Main.PEEL
In [5]:
using Main.PEEL

Some preliminary information.

This is replicated from "Scalable Estimation - Basic Concepts"

Conventions

What's in a name?

There are a number of conventions as to which qubit goes where. Here we are going to adopt a least-significant-digit approach - which is different from the normal 'ket' approach.

So for example: IZ means an I Pauli on qubit 2 and a Z Pauli on qubit 1 (indexing from 1).

Arrays indexed starting with 1.

For those less familiar with Julia: unlike, say, python, all arrays and vectors are indexed from 1. Without going into the merits or otherwise of this, we just need to keep it in mind. With our convention the bitstring 0000 represents the two-qubit Pauli II; it has value 0, but it will index the first value in our vector, i.e. index 1.
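A tiny sketch of that off-by-one bookkeeping (the eigenvalue vector here is a made-up placeholder):

```julia
# "0000" is the two qubit Pauli II: its binary value is 0, but it lives at
# index 1 of a Julia vector.
value = parse(Int, "0000"; base = 2)   # 0
eigenvalues = ones(16)                 # placeholder two qubit eigenvalue vector
eigenvalues[value + 1]                 # indexes the first entry
```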

Representing Paulis with bitstrings.

There are many ways to represent Paulis with bit strings, including for instance the convention used in Improved Simulation of Stabilizer Circuits, Scott Aaronson and Daniel Gottesman, arXiv:quant-ph/0406196v5.

Here we are going to use one that allows us to naturally translate the Pauli to its position in our vector of Pauli eigenvalues (of course this is arbitrary, we could map them however we like).

The mapping I am going to use is this (together with the least significant convention):


  • I $\rightarrow$ 00
  • X $\rightarrow$ 01
  • Y $\rightarrow$ 10
  • Z $\rightarrow$ 11

This then naturally translates as below:

SuperOperator - Pauli basis

We have defined our SuperOperator basis as below, which means with the Julia vector starting at 1 we have:

Pauli   Vector Index   Binary   Integer
II      1              0000     0
IX      2              0001     1
IY      3              0010     2
IZ      4              0011     3
XI      5              0100     4
XX      6              0101     5
XY      7              0110     6
XZ      8              0111     7
YI      9              1000     8
YX      10             1001     9
YY      11             1010     10
YZ      12             1011     11
ZI      13             1100     12
ZX      14             1101     13
ZY      15             1110     14
ZZ      16             1111     15

I have set this out in painful detail, because it's important to understand the mapping for the rest to make sense.

So for instance in our EIGENVALUE vector (think superoperator diagonal), the Pauli YX has binary representation Y=10 X=01, therefore 1001, has "binary value" 9 and is the 10th (9+1) entry in our eigenvalue vector.
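As a sketch, the whole mapping can be packed into a small helper (a hypothetical function of my own, not part of Juqst or the paper's code):

```julia
# Hypothetical helper: map a Pauli string such as "YX" to its index in the
# eigenvalue vector, using the convention I=00, X=01, Y=10, Z=11.
function pauliToIndex(p::AbstractString)
    bits = Dict('I' => 0, 'X' => 1, 'Y' => 2, 'Z' => 3)
    value = foldl((acc, c) -> 4acc + bits[c], p; init = 0)
    return value + 1    # Julia vectors index from 1
end

pauliToIndex("YX")   # Y=10, X=01 -> 1001 -> binary value 9 -> 10th entry
```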

What's in a Walsh Hadamard Transform?

The standard Walsh-Hadamard transform is based off tensor (Kronecker) products of the following matrix:

$$\left(\begin{array}{cc}1 & 1\\1 & -1\end{array}\right)^{\otimes n}$$

WHT_natural

So for one qubit it would be:

$\begin{array}{cc} & \begin{array}{cccc} \quad 00 & \quad 01 & \quad 10 &\quad 11 \end{array}\\ \begin{array}{c} 00\\ 01\\ 10\\11\end{array} & \left(\begin{array}{cccc} \quad 1&\quad 1& \quad 1 &\quad 1\\ \quad 1 &\quad -1 & \quad 1 &\quad -1\\ \quad 1&\quad 1 & \quad -1 &\quad -1\\ \quad 1 &\quad -1 & \quad -1 &\quad 1\\\end{array}\right) \end{array}$

where I have included above (and to the left) of the transform matrix the binary representations of the position and the matrix can also be calculated as $(-1)^{\langle i,j\rangle}$ where the inner product here is the binary inner product $i,j\in\mathbb{F}_2^n$ as $\langle i,j\rangle=\sum^{n-1}_{t=0}i[t]j[t]$ with arithmetic over $\mathbb{F}_2$
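We can check the $(-1)^{\langle i,j\rangle}$ formula against the Kronecker construction in a standalone sketch (base Julia plus LinearAlgebra, not using the Hadamard package):

```julia
using LinearAlgebra   # for kron

# ⟨i,j⟩ over F_2 is the parity of the AND of the two bitstrings.
binaryInner(i, j) = count_ones(i & j) % 2
H1 = [1 1; 1 -1]
W_natural = [(-1)^binaryInner(i, j) for i in 0:3, j in 0:3]
W_natural == kron(H1, H1)   # the two constructions agree
```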

WHT_Pauli

In the paper we use a different form of the Walsh-Hadamard transform. In this case we use the inner product of the Paulis, not the 'binary bitstring' inner product. The matrix is subtly different: some rows (or, if you prefer, columns) are swapped:

$\begin{array}{cc} & \begin{array}{cccc} I(00) & X(01) & Y(10) & Z(11) \end{array}\\ \begin{array}{c} I(00)\\ X(01)\\ Y(10)\\Z(11)\end{array} & \left(\begin{array}{cccc} \quad 1&\quad 1& \quad 1 &\quad 1\\ \quad 1 &\quad 1 & \quad -1 &\quad -1\\ \quad 1&\quad -1 & \quad 1 &\quad -1\\ \quad 1 &\quad -1 & \quad -1 &\quad 1\\\end{array}\right) \end{array}$
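Under the bit convention above, the Pauli inner product is the symplectic form - it is 1 exactly when the two Paulis anticommute. A standalone sketch (my own helper, not from the paper's code) that rebuilds the matrix above:

```julia
# WHT_Pauli entries from the symplectic inner product of the 2-bit labels
# a = (a1,a2), b = (b1,b2): ⟨a,b⟩ = a1*b2 + a2*b1 (mod 2).
sympInner(a, b) = (((a >> 1) & 1) * (b & 1) + (a & 1) * ((b >> 1) & 1)) % 2
W_pauli = [(-1)^sympInner(a, b) for a in 0:3, b in 0:3]
# Rows/columns ordered I(00), X(01), Y(10), Z(11) - matches the matrix above.
```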

Which one to use

When transforming Pauli eigenvalues to the probability of a particular error occurring, there are distinct advantages in using the WHT_Pauli transform: the order of the Pauli errors and the order of the Pauli eigenvalues remains the same. However, most common packages (including the one we are going to use here in Julia) don't support this type of transform; rather they implement the WHT_natural transform. The WHT_natural transform also makes the peeling algorithm slightly less fiddly. However, it does mean we need to be very careful about the order of things. If we use the WHT_natural transform then the following relationship holds - note the indices (labels) of the Paulis in the probability vector:

Example of Index change

So the natural translation (labelling of Paulis) then becomes as follows:

Eigenvalue vector space

  • I $\rightarrow$ 00
  • X $\rightarrow$ 01
  • Y $\rightarrow$ 10
  • Z $\rightarrow$ 11

Probability vector space

  • I $\rightarrow$ 00
  • X $\rightarrow$ 10
  • Y $\rightarrow$ 01
  • Z $\rightarrow$ 11

The numbers above are the numbers we used in the "Basic concepts" workbook - here we are going to set up some more general errors.

Set up some Pauli Errors to find

In [6]:
# Some functions to give us labels:
function probabilityLabels(x;qubits=2)
    str = string(x,base=4,pad=qubits)
    paulis = ['I','Y','X','Z']
    return map(x->paulis[parse(Int,x)+1],str)
end

function fidelityLabels(x;qubits=2)
    str = string(x,base=4,pad=qubits)
    paulis = ['I','X','Y','Z']
    return map(x->paulis[parse(Int,x)+1],str)
end
Out[6]:
fidelityLabels (generic function with 1 method)
In [7]:
# To demonstrate the peeling decoder we are going to set up a sparse (fake) distribution.
# This is going to be a 6 qubit system (so 4^6 = 4096 possible probabilities).
# Because of the protocol - we are assuming SPARSE errors - so most of the qubits
# will be good, but there will be a few errors for us to find.

dist = zeros(4096)
cl = zeros(6,4)

# Qubit 1
cl[1,2] = 0.01 #y1
cl[1,3] = 0.004 #x1
cl[1,4] = 0 #z1
cl[1,1] = 1-sum(cl[1,:])

# Qubit 2
cl[2,2] = 0.0003
cl[2,3] = 0.0001
cl[2,4]=  0
cl[2,1] = 1-sum(cl[2,:])

# Qubit 3
cl[3,2]  = 0.0002
cl[3,3] = 0.0001
cl[3,4] = 0.0003
cl[3,1] = 1-sum(cl[3,:])

# Qubit 4
cl[4,2] = 1e-4
cl[4,3] = 3e-4
cl[4,4] = 0
cl[4,1] = 1-sum(cl[4,:])


# Qubit 5
cl[5,2] = 0.004 
cl[5,3] = 0.02 
cl[5,4] = 0 #z1
cl[5,1] = 1-sum(cl[5,:])


# Qubit 6
cl[6,2] = 1e-4
cl[6,3] = 3e-4
cl[6,4] = 0
cl[6,1] = 1-sum(cl[6,:])




for q1 in 1:4
    for q2 in 1:4
        for q3 in 1:4
            for q4 in 1:4
                for q5 in 1:4
                    for q6 in 1:4
                        dist[(q6-1)*4^5+(q5-1)*4^4+(q4-1)*4^3+(q3-1)*4^2+(q2-1)*4+q1] = cl[1,q1]*cl[2,q2]*cl[3,q3]*cl[4,q4]*cl[5,q5]*cl[6,q6]
                    end
                end
            end
        end
    end
end

dist[3*4^4+2*16+3*4+3+1]= 0.004 # <----- Add an unexpected IZIXZZ error!
dist[1] = 0
dist[1] = 1-sum(dist)
for i in [1e-4,1e-5,1e-6,1e-7]
    print("Number greater than $i = $(count([x>i for x in dist]))\n")
end

print("\nThe Paulis errors we might hope to recover with this protocol:\n")
for (ix,i) in enumerate(dist)
    if i > 0.0001
        print("$(string(ix,pad=3)) $(probabilityLabels(ix-1,qubits=6)) $i\n")
    end
end
title("Distribution of Pauli error rates")
yscale("log")
ylabel("Error Rate")
xlabel("Paulis indexed and sorted by error rate")
scatter(2:50,reverse(sort(dist))[2:50])

print("\n\nThe actual oracle i.e. the eigenvalues of the Pauli channel we are going to use.\n\n")
actualOracle = round.(ifwht_natural(dist),digits=10)
Number greater than 0.0001 = 12
Number greater than 1.0e-5 = 19
Number greater than 1.0e-6 = 41
Number greater than 1.0e-7 = 55

The Paulis errors we might hope to recover with this protocol:
001 IIIIII 0.9566049496644947
002 IIIIIY 0.009742443708564856
003 IIIIIX 0.0038969774834259428
005 IIIIYI 0.0002882968036207967
017 IIIYII 0.00019223633173193815
049 IIIZII 0.0002883544975979071
129 IIXIII 0.0002882968036207967
257 IYIIII 0.003936905531411864
513 IXIIII 0.01968452765705932
514 IXIIIY 0.0001996402399296077
816 IZIXZZ 0.004
2049 XIIIII 0.0002882968036207967


The actual oracle i.e. the eigenvalues of the Pauli channel we are going to use.

Out[7]:
4096-element Array{Float64,1}:
 1.0
 0.972
 0.984
 0.972
 0.9914
 0.979412
 0.9914048
 0.9634168
 0.9918
 0.979804
 0.9918016
 0.9638056
 0.9992
 ⋮
 0.9493371786
 0.922350435
 0.9337424811
 0.9227557376
 0.9497171414
 0.9227227986
 0.9341194043
 0.9231250615
 0.9411471971
 0.9301642532
 0.9415540196
 0.9145710756
In [8]:
# So we can use our actual oracle to simulate an experiment.
# Rather than simulate a whole lot of Pauli circuits, which would serve to diagonalise the noise,
# I am going to take the simpler approach of assuming we have a Pauli channel - the SPAM will
# therefore just get added on at the end.

# Here I am just assuming a suitable 6 qubit, independent random noise channel for SPAM
rm = foldl(⊗,[randomFidelityNoise() for _ in 1:6])
print("Our random SPAM channel has fidelity $(fidelity(rm)) and unitarity $(unitarity(rm))\n")
Our random SPAM channel has fidelity 0.8539479347605721 and unitarity 0.7615487972527085
In [9]:
# Our noise channel is a Pauli channel, so just diagonalise our eigenvalue vector

using LinearAlgebra
noise = diagm(0=>actualOracle);
In [10]:
# So we have everything we need, we have SPAM channels and the 'diagonal' channel we would get if we 
# averaged over Paulis.

So let's get started!

Remember the point here is that the eigenvalues are dense, while the Pauli error rates are sparse. However, we can't sample the Pauli error rates in a SPAM (state preparation and measurement) error free way. We CAN, however, sample the eigenvalues in a SPAM free way. We want to sparsely sample the dense eigenvalues in order to reconstruct the sparse error probabilities.
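A toy one qubit illustration of that duality - a sparse probability vector transforms to a dense eigenvalue vector. The WHT here is built by hand (so the snippet is standalone, up to the normalisation conventions of the Hadamard package) and the error rates are made up:

```julia
using LinearAlgebra   # for kron

# Natural-ordered WHT for one qubit (2 bits), built by hand.
H1 = [1 1; 1 -1]
W = kron(H1, H1)
p = [0.97, 0.01, 0.02, 0.0]   # sparse: mostly no error, tiny X/Y errors
λ = W * p                     # every eigenvalue is affected - dense
```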

This shows how to do this.

Choose our sub-sampling matrices

One of the main ideas behind the papers is that we can use the protocols in Efficient learning of quantum channels and Efficient learning of quantum noise to learn the eigenvalues of an entire stabiliser group ($2^n$ entries) at once, to arbitrary precision. Whilst it might be quite difficult to learn the eigenvalues of an arbitrary group - this would require an arbitrary $n$-qubit Clifford gate (which can be a lot of primitive gates!) - even today's noisy devices can quite easily create a 2-local stabiliser group over $n$ qubits.

Here we are simulating the recovery in a 6 qubit system. That means our bitstrings are 12 bits long.

Our experiments will need two sub-sampling groups. The first sub-sampling group will be three of our potential MUBs (set out below), selected randomly, on qubit pairs 1&2, 3&4 and 5&6. The second sub-sampling group will have single qubit MUBs (potentialSingles below) on qubits 1 and 6, and two qubit MUBs on qubits 2&3 and 4&5.

This maximises the separation of Paulis achievable using local stabiliser (two qubit) groups.

In [11]:
potentialSingles = [
                    [[0,0],[0,1]], # IX
                    [[0,0],[1,0]], # IY
                    [[0,0],[1,1]], # IZ
                    ]


all2QlMuBs =  [  [[0,0,0,0],[1,1,0,1],[1,0,1,1],[0,1,1,0]], #II ZX YZ XY
                 [[0,0,0,0],[1,1,1,0],[0,1,1,1],[1,0,0,1]], #II ZY XZ YX
                 [[0,0,0,0],[0,0,0,1],[0,1,0,0],[0,1,0,1]], #II IX XI XX
                 [[0,0,0,0],[0,0,1,0],[1,0,0,0],[1,0,1,0]], #II IY YI YY
                 [[0,0,0,0],[0,0,1,1],[1,1,0,0],[1,1,1,1]]] #II IZ ZI ZZ

# We only want to choose the first two types for the initial runs
potentialMuBs = [all2QlMuBs[1],all2QlMuBs[2]]
Out[11]:
2-element Array{Array{Array{Int64,1},1},1}:
 [[0, 0, 0, 0], [1, 1, 0, 1], [1, 0, 1, 1], [0, 1, 1, 0]]
 [[0, 0, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]
In [12]:
paulisAll=[]
mappings=[]
experiments = []

# For this example I am just going to choose randomly between the first two types of MUB.
# For six qubits we need three two-qubit MUBs.

for i = 1:1
     push!(mappings,Dict())
     choose = rand(1:2,3)
     push!(experiments,vcat([(2,choose[1])],[(2,choose[2])],[(2,choose[3])]))
     push!(paulisAll,vcat([potentialMuBs[x] for x in choose]))
end


# For the next subsample group, choose a single, two doubles and a single.
for i = 1:1
         push!(mappings,Dict())
         chooseS = rand(1:3,2)
         choose = rand(1:2,2)
         push!(experiments,vcat([(1,chooseS[1])],[(2,choose[1])],[(2,choose[2])],[(1,chooseS[2])]))
         push!(paulisAll,vcat([potentialSingles[chooseS[1]]],[potentialMuBs[x] for x in choose],[potentialSingles[chooseS[2]]]))
end


# Create the 'bit' offsets
# This is used to work out the Pauli we isolate in a single bin. Here we have 2*(n=6) = 12 bits, 2 bits per qubit.
ds = vcat(
    [[0 for _ = 1:12]],
    [map(x->parse(Int,x),collect(reverse(string(i,base=2,pad=12)))) for i in [2^b for b=0:11]]);
# e.g.
print("For the offsets we are using the simplest method for Pauli identification eg:\n")
for (ix,i) in enumerate(ds)
    print("Offset $(ix-1): $i\n")
end

print("\n\nWe have selected two sub-sampling groups. \n\nThe first is three randomly selected 'two qubit' MUBs on the qubits:\n")
print("Qubits 1&2: $(paulisAll[1][1])\n")
print("Qubits 3&4: $(paulisAll[1][2])\n")
print("Qubits 5&6: $(paulisAll[1][3])\n\n")

print("\n\nThe second, we want to offset by 1 qubit, so we have a single qubit MUB on qubit 1, a two qubit MUB on 2&3 and on 4&5, and then a single qubit MUB on qubit 6.\n")
print("Qubit 1   : $(paulisAll[2][1])\n")
print("Qubits 2&3: $(paulisAll[2][2])\n")
print("Qubits 4&5: $(paulisAll[2][3])\n")
print("Qubit 6   : $(paulisAll[2][4])\n")
For the offsets we are using the simplest method for Pauli identification eg:
Offset 0: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
Offset 1: [1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
Offset 2: [0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
Offset 3: [0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
Offset 4: [0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0]
Offset 5: [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
Offset 6: [0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
Offset 7: [0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]
Offset 8: [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0]
Offset 9: [0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0]
Offset 10: [0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0]
Offset 11: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0]
Offset 12: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1]


We have selected two sub-sampling groups. 

The first is three randomly selected 'two qubit' MUBs on the qubits:
Qubits 1&2: [[0, 0, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]
Qubits 3&4: [[0, 0, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]
Qubits 5&6: [[0, 0, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]



The second, we want to offset by 1 qubit, so we have a single qubit MUB on qubit 1, a two qubit MUB on 2&3 and on 4&5, and then a single qubit MUB on qubit 6.
Qubit 1   : [[0, 0], [1, 1]]
Qubits 2&3: [[0, 0, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]
Qubits 4&5: [[0, 0, 0, 0], [1, 1, 0, 1], [1, 0, 1, 1], [0, 1, 1, 0]]
Qubit 6   : [[0, 0], [1, 1]]
In [13]:
# Here there are two sets of experiments
# The tuple for each block of qubits is (number of qubits, experiment type);
# the qubit numbers for each will add up to 6 (since we have 6 qubits)
experiments
Out[13]:
2-element Array{Any,1}:
 [(2, 2), (2, 2), (2, 2)]
 [(1, 3), (2, 2), (2, 1), (1, 3)]

The Experiments

So what are these experiments?

Let's pull in the figure from the paper which shows all the experimental designs:

diagram showing designs

Okay, that's a bit busy, let's break it down.

Step 1 was to choose one MUB from the set for each pair of qubits - that is what we did in paulisAll; let's look at the first element of that

In [14]:
paulisAll[1]
Out[14]:
3-element Array{Array{Array{Int64,1},1},1}:
 [[0, 0, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]
 [[0, 0, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]
 [[0, 0, 0, 0], [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]

Because they were randomly chosen, you will have to look at the output and convince yourself that each of the three elements shown above is indeed one of the two MUB sets we allowed.

So the first experiment (the top row of (2)) is just an experiment to extract those eigenvalues.

How do we do that?

Well we have the circuits we need in the appendix of the paper - they look like this:

circuits

What we need on each two qubit pair is either (a) or (c), depending on the MUB randomly selected.

Setting up the circuits

For the two qubit experiments, circuit 1 is (a) above and circuit 2 is (c) above.

Effectively we take an input state in the computational basis, apply these circuits, do a Pauli twirl of varying gates, reverse the circuit, and measure in the computational basis.

To simulate this I am just going to set up the circuits as a superoperator. The way the RB-style Pauli twirling works, the noise will be subsumed into the SPAM error - which we will then fit out - but I am getting ahead of myself.

Let's create the superoperator that does (a) and (c) on two qubits:


For a more detailed explanation of what I am doing in the next cell, look at the Hadamard Basics and Observations workbook


In [15]:
using Juqst
# Set up the relevant gates
cnot21 = [1 0;0 1]⊗[1 0;0 0] + [0 1;1 0]⊗[0 0;0 1]
superCnot21 = makeSuper(cnot21)
superPI = makeSuper([1 0;0 im]⊗[1 0;0 1])
superIP = makeSuper([1 0;0 1]⊗[1 0;0 im])
superHH = makeSuper((1/sqrt(2)*[1 1;1 -1])⊗(1/sqrt(2)*[1 1;1 -1]));
superH = makeSuper((1/sqrt(2)*[1 1;1 -1]))
superP = makeSuper([1 0;0 im])
# The two qubit experiments are (a) and (c) above
# Recall circuits are left to right, matrices right to left
circuit2q = [superCnot21*superPI*superCnot21*superIP*superHH,
             superCnot21*superPI*superCnot21*superPI*superHH]
# The single qubit ones are the basis changes for X, Y and Z
circuit1q = [superH,superP*superH,makeSuper([1 0;0 1])];

So the first experiment is as shown in the above diagram ... this bit:

experiment

where our C's (on the left hand side of that diagram - which, I have just noticed, is (irritatingly) drawn right to left; you can tell from the |0> on the right). Each of the circuit2q superoperators calculated above represents one of our two choices of MUBs; we would then twirl with Paulis, average over lots of different twirls (to diagonalise the noise), and then reverse out and measure.

I am NOT going to simulate the Pauli twirl; rather, I am just going to use the (already) diagonal noise matrix we calculated earlier. The noisy circuit2q gates and the state and measurement errors will be simulated using the measurement (SPAM) noise matrix formed earlier.

(I probably will do the whole thing in a qiskit simulation)

In [16]:
# So the circuits we want are shown here
experiments[1]
Out[16]:
3-element Array{Tuple{Int64,Int64},1}:
 (2, 2)
 (2, 2)
 (2, 2)
In [17]:
function generateGates(experiment,circuits)
    # We expect the gates as tuples, i.e. (a,b), where a is the number of qubits and b is the experiment number
    # circuits is an array indexed so that circuits[a][b] returns an "a" qubit experiment
    # we can then use foldl to kron up the circuits
    return foldl(⊗, [circuits[x[1]][x[2]] for x in reverse(experiment)])
end
Out[17]:
generateGates (generic function with 1 method)
In [18]:
# Simplify the code here, we know it's two qubit blocks:
# Note the order of the \kron - obviously with a real experiment you just set up the actual experiment!

# You can think of generateGates as doing the equivalent of the following:
#initialGates = circuit2q[experiments[1][3][2]]⊗circuit2q[experiments[1][2][2]]⊗circuit2q[experiments[1][1][2]]
initialGates = generateGates(experiments[1],[circuit1q,circuit2q])

reverseGates = transpose(initialGates); # yay superoperators.
In [19]:
# So there is a bit of a delay here; mainly it's genZs, which isn't the most efficient, but we only do it once.

# We can generate the computational basis measurement vectors (again see Hadamard Basics and Observations for details)
zs = Juqst.genZs(6);
start = zs[1]; #computational basis all 0 state

Faking (simulating) the experiment

In theory we could put this in qiskit or cirq or something - but here I want to simulate a very specific noise channel to show we can recover it, and those simulators only allow more general noise (at the time of writing).

So the basic concept is:

  • we start with the computational basis vector,
  • we apply our SPAM
  • we apply our basis change circuit (initial Gates)
  • we apply our diagonal noise $m$ times (this is the noise we want to measure - see below for choice of $m$)
  • we undo our basis change
  • we find our final probability vector (for observed errors in this basis) (basically using each of the measurement vectors we generated - the zs)
  • then we do a whole lot of random dice rolls to build up some limited shot statistics.

Finally repeat all the above for different values of $m$. Here we are just going to use m = 1,3,5,8,10,15,20,40,60,80

In [20]:
experiment1_allProbs = []
# These will depend on your system
lengths =  [1,3,5,8,10,15,20,40,60,80]
@showprogress for m in lengths
    wholeCircuit = reverseGates*noise^m*initialGates*rm*start./64 # ./64 = Normalise the measurement/state vectors
    probs = [z'*wholeCircuit for z in zs]
    push!(experiment1_allProbs,probs)
end

# SO NOTE that although there are 4096 eigenvalues, we can only measure 64 of them at a time,
# so the probability vectors are only 64 long.
Progress: 100%|█████████████████████████████████████████| Time: 0:01:27
In [21]:
# Quick sanity check that all of our probability vectors sum to 1
for (ix,i) in enumerate(experiment1_allProbs)
    print("M=$(lengths[ix]) sum = $(round(sum(i),digits=9))\n")
end
M=1 sum = 1.0
M=3 sum = 1.0
M=5 sum = 1.0
M=8 sum = 1.0
M=10 sum = 1.0
M=15 sum = 1.0
M=20 sum = 1.0
M=40 sum = 1.0
M=60 sum = 1.0
M=80 sum = 1.0
In [22]:
# So the final step, now that we have the probability vectors for this experiment, is to simulate limited shot statistics.

# If we imagine doing 100 sequences, with 200 measurements per sequence, that's 20,000 shots.
const shotsToDo = 20000

# Quick helper function
function shotSimulator(size,shots,cumulativeMatrix)
    toRet = []
    for todo = 1:length(cumulativeMatrix)
        counted = zeros(size)
        rolls = rand(shots)
        for i in rolls
            # using the fact that the cumulativeMatrix is (effectively) sorted.
            counted[searchsortedfirst(cumulativeMatrix[todo],i)] += 1
        end
        push!(toRet,counted./shots)
    end
    return toRet
end

# We need cumulative probability matrices
cumMatrix = map(cumsum,experiment1_allProbs)
experiment1_observed = shotSimulator(64,shotsToDo,cumMatrix);

The next step is to recreate the eigenvalues that we have 'measured' in the experiment.

This is just following the protocol in https://arxiv.org/abs/1907.13022, Efficient Learning of Quantum Noise (and https://arxiv.org/abs/1907.12976 Efficient Estimation of Pauli Channels)

We want to Hadamard transform the observed probabilities.

If we do that, we will see for each eigenvalue (element in the vector) an exponential decay curve - let's look

In [23]:
experiment1_SpamyEigenvalues = map(ifwht_natural,experiment1_observed);
# Don't bother with the first element as it's always 1.
for x = 2:64
    toPlot = [d[x] for d in experiment1_SpamyEigenvalues]
    plot(lengths,toPlot)
end

So fit and extract

What we want to do is fit them to an exponential decay curve and extract the decay factor. Juqst has the code for this, and you are welcome to look at the documentation and example notebooks for that!

For those that can't be bothered reading docs: fitTheFidelities returns params, which contains both the SPAM component and the fidelity; the second return value is the number of measurements used in the fit, and the last indicates whether any of the fits failed. We just want to extract the eigenvalues:
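To see what such a fit is doing, here is a minimal log-linear sketch. This is NOT fitTheFidelities (Juqst's fit is more careful) - just the underlying idea, with made-up data:

```julia
using LinearAlgebra   # for the least-squares backslash solve

# Each measured eigenvalue decays as y ≈ A * f^m, so log y is linear in m:
# log y = log A + m log f. A least-squares fit recovers f (the eigenvalue)
# and A (the SPAM contribution).
function fitDecay(ms, ys)
    X = hcat(ones(length(ms)), float.(ms))
    logA, logf = X \ log.(ys)
    return exp(logA), exp(logf)   # (SPAM constant, eigenvalue)
end

# Made-up noiseless data: A = 0.9, f = 0.97
A, f = fitDecay([1, 3, 5, 8, 10], 0.9 .* 0.97 .^ [1, 3, 5, 8, 10])
```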

In [24]:
(params,l, failed) = fitTheFidelities(lengths,experiment1_observed)
experiment1_fidelities = vcat(1,[p[2] for p in params]) # We don't fit the first one, it is always 1 for CPTP maps
Out[24]:
64-element Array{Float64,1}:
 1.0
 0.9832608361319543
 0.9646570732960278
 0.9800387381402719
 0.9903753752127014
 0.9897432673667202
 0.9703648881064589
 0.9707983647823297
 0.9910993219118764
 0.9905212250582103
 0.9712593549596166
 0.9716276673089288
 0.9983868398109271
 ⋮
 0.989878184282456
 0.9729367252562507
 0.9545101559822675
 0.9698695367819199
 0.9906399998241464
 0.9736318429466285
 0.9555478313164661
 0.9707389168429829
 0.9817336194412914
 0.9811482272883943
 0.9616005947070917
 0.9615911546605903

So what did we actually extract?

Note this bit is kind of fiddly - to get everything to match up

Well we extracted certain fidelities!

The four fidelities on the first pair of qubits correspond to:

In [25]:
paulisAll[1][1]
Out[25]:
4-element Array{Array{Int64,1},1}:
 [0, 0, 0, 0]
 [1, 1, 1, 0]
 [0, 1, 1, 1]
 [1, 0, 0, 1]
In [26]:
paulisAll[1][2]
Out[26]:
4-element Array{Array{Int64,1},1}:
 [0, 0, 0, 0]
 [1, 1, 1, 0]
 [0, 1, 1, 1]
 [1, 0, 0, 1]
In [27]:
#Turn those into a number...
fidelitiesExtracted = []
function binaryArrayToNumber(x)
    return foldl((y1,y2)->y1<<1+y2,x)
end
for x0 in paulisAll[1][3]
    p56 = binaryArrayToNumber(x0)
    for x1 in paulisAll[1][2]
        p34 = binaryArrayToNumber(x1)
        for x2 in paulisAll[1][1]
            p12= binaryArrayToNumber(x2)
            push!(fidelitiesExtracted,p56*4^4+p34*16+p12+1) # + 1 because we index from 1 in Julia
        end
    end
end
In [28]:
fidelitiesExtracted
Out[28]:
64-element Array{Any,1}:
    1
   15
    8
   10
  225
  239
  232
  234
  113
  127
  120
  122
  145
    ⋮
 2529
 2543
 2536
 2538
 2417
 2431
 2424
 2426
 2449
 2463
 2456
 2458
In [29]:
xx = ifwht_natural([z'*reverseGates*noise*initialGates*start for z in zs]/64)
for p in xx
    print("$p : $(findall(x->isapprox(x,p),actualOracle))\n")
end
1.0000000000000002 : [1]
0.9832064000000004 : [15, 195, 269, 449, 3075, 3329]
0.9634168000000002 : [8, 52]
0.9798040000000003 : [10]
0.9904006400000003 : [45, 225, 3105]
0.9896211041000001 : [239, 493, 3119, 3299, 3373, 3553]
0.9698631548000001 : [232, 2256, 3112, 3216]
0.9702369407000003 : [234, 1230, 3114, 3150]
0.9912001200000004 : [69, 113, 137, 1029, 1073, 2057]
0.9904135538000003 : [127, 381, 1087, 1187, 1223, 1267, 1341, 1441, 1477, 1521, 2147, 2251, 2401, 2505, 3143, 3187, 3211, 3397, 3441, 3465]
0.9706397831000002 : [120, 188, 1080, 1132, 2108]
0.9710202744000004 : [122, 1082]
0.9984006000000005 : [145, 2065]
0.9816210645000005 : [159, 413, 2079, 2259, 2333, 2513, 3219, 3473]
0.9618631160000003 : [152, 1244, 2072, 3164]
0.9782369015000004 : [86, 154, 1046, 2074]
0.9512320000000001 : [525, 705, 3585]
0.9507968975000002 : [719, 3599, 3779]
0.9318140799000001 : [576, 676, 712, 756, 2596, 3592, 3636]
0.9318593505000003 : [610, 714, 1570, 3594]
0.9576978427000002 : [749, 3629, 3809]
0.9412762310000002 : [3823]
0.9223237737000003 : [3816]
0.9383561771000004 : [3818]
0.9584647295000001 : [637, 1597, 1697, 1733, 1777, 2657, 2761, 3653, 3697, 3721]
0.9420363741000003 : [1711, 1791, 2671, 3711]
0.9230687405000002 : [1704, 1784, 2664, 2732, 2812, 3704, 3772]
0.9391075758000004 : [694, 1638, 1706, 1786, 2614, 2666, 3706]
0.9496978043000003 : [669, 2589, 2769, 3729]
0.9492761929000003 : [2783, 3743]
0.9303237365000002 : [2776, 3736]
0.9303561395000003 : [1750, 2778, 3670, 3738]
0.9518095999999999 : [833, 1793]
0.9354397671 : [811, 847, 1807, 1987, 3907]
0.9166038358 : [840, 884, 908, 1800, 1844, 2828]
0.9325868533000001 : [842, 1802]
0.9422873137999999 : [877, 1837, 2017, 3937, 4041]
0.9419308672999999 : [2031, 3951]
0.9231250615 : [2024, 3052, 3944, 4012, 4092]
0.9230953112000001 : [934, 1014, 2026, 2854, 2958, 3894, 3946]
0.9430482664999998 : [1861, 1905, 1929, 2889]
0.9426851285 : [1919, 2951, 2995]
0.9238642636999999 : [1912, 1980, 2940]
0.9238408957 : [1914]
0.9502872757 : [1937, 2897]
0.9339308295 : [1951, 2911]
0.9151250244999999 : [1944, 2904, 2972]
0.9310952738999999 : [1878, 1946, 2906]
0.9834048 : [131, 385, 2051, 2305]
0.9826867828 : [295, 399, 2319, 2499, 3459]
0.9630672783 : [392, 436, 1380, 1484, 2312, 2356, 3404]
0.9633823887000001 : [326, 370, 394, 1286, 1330, 2314]
0.9898191868000001 : [175, 255, 429, 509, 2095, 2275, 2349, 2529, 3135, 3235, 3271, 3315, 3389, 3489, 3525, 3569]
0.9731151128000002 : [2543, 3503, 3583]
0.9535269870000003 : [2536, 3496, 3576]
0.9698287985000003 : [1510, 2538, 3430, 3498, 3578]
0.9906117951000001 : [1159, 1203, 1413, 1457, 2119, 2163, 2187, 2373, 2417, 2441]
0.9739007512000003 : [1471, 2431]
0.9542969401000001 : [1464, 2424, 2492]
0.9706053993000002 : [1398, 1466, 2426]
0.9818191472000001 : [1119, 1373, 2195, 2449]
0.9811150735000002 : [2463]
0.9615269485000001 : [2456]
0.9618287597000003 : [1430, 2390, 2458]
In [30]:
# And (just out of interest) we can compare the fidelities we extracted with the actual values
for i in 1:64
    print("$(fidelityLabels(fidelitiesExtracted[i]-1,qubits=6)) ($(fidelitiesExtracted[i]-1)):\tEstimate: $(experiment1_fidelities[i]) <-> $(actualOracle[fidelitiesExtracted[i]]) \tPercentage Error: $(round.(abs(actualOracle[fidelitiesExtracted[i]]-experiment1_fidelities[i])/(actualOracle[fidelitiesExtracted[i]])*100,digits=4))%\n")
end
IIIIII (0):	Estimate: 1.0 <-> 1.0 	Percentage Error: 0.0%
IIIIZY (14):	Estimate: 0.9832608361319543 <-> 0.9832064 	Percentage Error: 0.0055%
IIIIXZ (7):	Estimate: 0.9646570732960278 <-> 0.9634168 	Percentage Error: 0.1287%
IIIIYX (9):	Estimate: 0.9800387381402719 <-> 0.979804 	Percentage Error: 0.024%
IIZYII (224):	Estimate: 0.9903753752127014 <-> 0.99040064 	Percentage Error: 0.0026%
IIZYZY (238):	Estimate: 0.9897432673667202 <-> 0.9896211041 	Percentage Error: 0.0123%
IIZYXZ (231):	Estimate: 0.9703648881064589 <-> 0.9698631548 	Percentage Error: 0.0517%
IIZYYX (233):	Estimate: 0.9707983647823297 <-> 0.9702369407 	Percentage Error: 0.0579%
IIXZII (112):	Estimate: 0.9910993219118764 <-> 0.99120012 	Percentage Error: 0.0102%
IIXZZY (126):	Estimate: 0.9905212250582103 <-> 0.9904135538 	Percentage Error: 0.0109%
IIXZXZ (119):	Estimate: 0.9712593549596166 <-> 0.9706397831 	Percentage Error: 0.0638%
IIXZYX (121):	Estimate: 0.9716276673089288 <-> 0.9710202744 	Percentage Error: 0.0626%
IIYXII (144):	Estimate: 0.9983868398109271 <-> 0.9984006 	Percentage Error: 0.0014%
IIYXZY (158):	Estimate: 0.9816396544088348 <-> 0.9816210645 	Percentage Error: 0.0019%
IIYXXZ (151):	Estimate: 0.9630306745343353 <-> 0.961863116 	Percentage Error: 0.1214%
IIYXYX (153):	Estimate: 0.9783072392054106 <-> 0.9782369015 	Percentage Error: 0.0072%
ZYIIII (3584):	Estimate: 0.9514835150444942 <-> 0.951232 	Percentage Error: 0.0264%
ZYIIZY (3598):	Estimate: 0.9515309733577308 <-> 0.9507968975 	Percentage Error: 0.0772%
ZYIIXZ (3591):	Estimate: 0.932643098614912 <-> 0.9318140799 	Percentage Error: 0.089%
ZYIIYX (3593):	Estimate: 0.9322540498379835 <-> 0.9318593505 	Percentage Error: 0.0424%
ZYZYII (3808):	Estimate: 0.9579087801820729 <-> 0.9576978427 	Percentage Error: 0.022%
ZYZYZY (3822):	Estimate: 0.9417971158393882 <-> 0.941276231 	Percentage Error: 0.0553%
ZYZYXZ (3815):	Estimate: 0.9229103493504338 <-> 0.9223237737 	Percentage Error: 0.0636%
ZYZYYX (3817):	Estimate: 0.938529492706552 <-> 0.9383561771 	Percentage Error: 0.0185%
ZYXZII (3696):	Estimate: 0.9586122537568325 <-> 0.9584647295 	Percentage Error: 0.0154%
ZYXZZY (3710):	Estimate: 0.9425888723713586 <-> 0.9420363741 	Percentage Error: 0.0586%
ZYXZXZ (3703):	Estimate: 0.9241770495676275 <-> 0.9230687405 	Percentage Error: 0.1201%
ZYXZYX (3705):	Estimate: 0.9392281393389654 <-> 0.9391075758 	Percentage Error: 0.0128%
ZYYXII (3728):	Estimate: 0.9499896036110115 <-> 0.9496978043 	Percentage Error: 0.0307%
ZYYXZY (3742):	Estimate: 0.9500263013509378 <-> 0.9492761929 	Percentage Error: 0.079%
ZYYXXZ (3735):	Estimate: 0.9310705623964437 <-> 0.9303237365 	Percentage Error: 0.0803%
ZYYXYX (3737):	Estimate: 0.9304279718914741 <-> 0.9303561395 	Percentage Error: 0.0077%
XZIIII (1792):	Estimate: 0.9523829811985322 <-> 0.9518096 	Percentage Error: 0.0602%
XZIIZY (1806):	Estimate: 0.9354014070361333 <-> 0.9354397671 	Percentage Error: 0.0041%
XZIIXZ (1799):	Estimate: 0.9164784117221473 <-> 0.9166038358 	Percentage Error: 0.0137%
XZIIYX (1801):	Estimate: 0.9324552055488328 <-> 0.9325868533 	Percentage Error: 0.0141%
XZZYII (2016):	Estimate: 0.9428764991933171 <-> 0.9422873138 	Percentage Error: 0.0625%
XZZYZY (2030):	Estimate: 0.9424099413395579 <-> 0.9419308673 	Percentage Error: 0.0509%
XZZYXZ (2023):	Estimate: 0.9231068616750641 <-> 0.9231250615 	Percentage Error: 0.002%
XZZYYX (2025):	Estimate: 0.9226096191597515 <-> 0.9230953112 	Percentage Error: 0.0526%
XZXZII (1904):	Estimate: 0.9433893508802024 <-> 0.9430482665 	Percentage Error: 0.0362%
XZXZZY (1918):	Estimate: 0.9433295308810477 <-> 0.9426851285 	Percentage Error: 0.0684%
XZXZXZ (1911):	Estimate: 0.924157841802963 <-> 0.9238642637 	Percentage Error: 0.0318%
XZXZYX (1913):	Estimate: 0.9233739259911855 <-> 0.9238408957 	Percentage Error: 0.0505%
XZYXII (1936):	Estimate: 0.9506967945081665 <-> 0.9502872757 	Percentage Error: 0.0431%
XZYXZY (1950):	Estimate: 0.9337481845917214 <-> 0.9339308295 	Percentage Error: 0.0196%
XZYXXZ (1943):	Estimate: 0.9153037424583959 <-> 0.9151250245 	Percentage Error: 0.0195%
XZYXYX (1945):	Estimate: 0.930701917126143 <-> 0.9310952739 	Percentage Error: 0.0422%
YXIIII (2304):	Estimate: 0.9833897349521105 <-> 0.9834048 	Percentage Error: 0.0015%
YXIIZY (2318):	Estimate: 0.9828025932436972 <-> 0.9826867828 	Percentage Error: 0.0118%
YXIIXZ (2311):	Estimate: 0.9632490925574481 <-> 0.9630672783 	Percentage Error: 0.0189%
YXIIYX (2313):	Estimate: 0.9637617576218745 <-> 0.9633823887 	Percentage Error: 0.0394%
YXZYII (2528):	Estimate: 0.989878184282456 <-> 0.9898191868 	Percentage Error: 0.006%
YXZYZY (2542):	Estimate: 0.9729367252562507 <-> 0.9731151128 	Percentage Error: 0.0183%
YXZYXZ (2535):	Estimate: 0.9545101559822675 <-> 0.953526987 	Percentage Error: 0.1031%
YXZYYX (2537):	Estimate: 0.9698695367819199 <-> 0.9698287985 	Percentage Error: 0.0042%
YXXZII (2416):	Estimate: 0.9906399998241464 <-> 0.9906117951 	Percentage Error: 0.0028%
YXXZZY (2430):	Estimate: 0.9736318429466285 <-> 0.9739007512 	Percentage Error: 0.0276%
YXXZXZ (2423):	Estimate: 0.9555478313164661 <-> 0.9542969401 	Percentage Error: 0.1311%
YXXZYX (2425):	Estimate: 0.9707389168429829 <-> 0.9706053993 	Percentage Error: 0.0138%
YXYXII (2448):	Estimate: 0.9817336194412914 <-> 0.9818191472 	Percentage Error: 0.0087%
YXYXZY (2462):	Estimate: 0.9811482272883943 <-> 0.9811150735 	Percentage Error: 0.0034%
YXYXXZ (2455):	Estimate: 0.9616005947070917 <-> 0.9615269485 	Percentage Error: 0.0077%
YXYXYX (2457):	Estimate: 0.9615911546605903 <-> 0.9618287597 	Percentage Error: 0.0247%

Not bad for 20,000 shots.
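The percentage error column in tables like the one above is just $|\text{estimate}-\text{actual}|/\text{actual}\times 100$. A minimal self-contained sketch of that calculation (the numbers below are illustrative stand-ins, not the values from this run):

```julia
# Compare estimated Pauli fidelities against true values.
# These vectors are illustrative stand-ins, not this run's data.
estimates = [1.0, 0.98326, 0.96466]
actuals   = [1.0, 0.98321, 0.96342]

# Elementwise percentage error, as printed in the tables above.
percentErrors = abs.(estimates .- actuals) ./ actuals .* 100
println(round.(percentErrors, digits=4))  # [0.0, 0.0051, 0.1287]
```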

Building up our eigenvalue Oracle

So now we can begin to populate our estimating oracle that we will use to run the peeling decoder.

We have 64 values, let's fill them in...

In [31]:
# Let's make it a vector of vectors
# - why? Well, some of the experiments will have duplicate eigenvalue estimates, and we will want
# to end up averaging them.

estimateOracle = [[] for _ in 1:4096]

for i in 1:64
    push!(estimateOracle[fidelitiesExtracted[i]],experiment1_fidelities[i])
end
In [32]:
estimateOracle
Out[32]:
4096-element Array{Array{Any,1},1}:
 [1.0]
 []
 []
 []
 []
 []
 []
 [0.9646570732960278]
 []
 [0.9800387381402719]
 []
 []
 []
 ⋮
 []
 []
 []
 []
 []
 []
 []
 []
 []
 []
 []
 []
In [33]:
# So I am going to do all of the above for each of the remaining experiment types, for each of the qubit pairs
e1_all_additional_fidelities = []
e1_fidelity_extracted = []
# Note we don't actually need to save the actual probabilities - but I am going to use them later
# to demonstrate some different recovery regimes.
e1_all_actualProbabilities = []
e1_gatesUsed = []
@showprogress 1 "QubitPairs" for qubitPairOn = 1:2 # qubit pairs here are 1&2, 3&4 and 5&6
    for experimentType = 1:2
        if experiments[1][qubitPairOn][2] != experimentType
            # Its one we haven't done
            expOnFirstPair  = experiments[1][1][2]
            expOnSecondPair = experiments[1][2][2]
            expOnThirdPair  = experiments[1][3][2]

            
            if qubitPairOn == 1
                expOnFirstPair = experimentType
            elseif qubitPairOn == 2
                expOnSecondPair = experimentType
            else
                expOnThirdPair = experimentType
            end
            initialGates = circuit2q[expOnThirdPair]⊗circuit2q[expOnSecondPair]⊗circuit2q[expOnFirstPair]
            push!(e1_gatesUsed,initialGates)
            reverseGates = transpose(initialGates) # yay superoperators.
            # So at this point we have set up one of the 'new' experiments.
            additionalExperiment = []
            # Get the actual probabilities.
            @showprogress 1 "QGroup: $qubitPairOn Exp: $experimentType" for m in lengths
                wholeCircuit = rm*reverseGates*noise^m*initialGates*start./64 # Normalising our zs
                probs = [z'*wholeCircuit for z in zs]
                push!(additionalExperiment,probs)
            end
            push!(e1_all_actualProbabilities,additionalExperiment)
            # Generate the measurement statistics
            cumMatrix = map(cumsum,additionalExperiment)
            experiment1_additional_observed = shotSimulator(64,shotsToDo,cumMatrix);
            # Fit and extract the fidelities
            (params,l, failed) = fitTheFidelities(lengths,experiment1_additional_observed)
            experiment1_additional_fidelities = vcat(1,[p[2] for p in params])
            push!(e1_all_additional_fidelities,experiment1_additional_fidelities)
            fidelitiesExtracted=[]
            for x0 in all2QlMuBs[expOnThirdPair]
                p56 = binaryArrayToNumber(x0)
                for x1 in all2QlMuBs[expOnSecondPair]
                    p34 = binaryArrayToNumber(x1)
                    for x2 in all2QlMuBs[expOnFirstPair]
                        p12= binaryArrayToNumber(x2)
                        push!(fidelitiesExtracted,p56*4^4+p34*16+p12+1) # +1 because Julia arrays are 1-indexed
                    end
                end
            end
            push!(e1_fidelity_extracted,fidelitiesExtracted)
        end
    end
end
QGroup: 1 Exp: 1100%|███████████████████████████████████| Time: 0:01:25
QGroup: 2 Exp: 1100%|███████████████████████████████████| Time: 0:01:22
QubitPairs100%|█████████████████████████████████████████| Time: 0:02:48
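The index arithmetic `p56*4^4+p34*16+p12+1` used in these loops packs one 4-bit pattern per qubit pair into a single oracle offset, and is easy to check in isolation. A self-contained sketch (the bit patterns are illustrative; the resulting 0-based offset 3584 is the entry the tables above label ZYIIII):

```julia
# Check of the oracle index packing used in the loops above.
# Each 2-qubit group contributes 4 bits; the groups are packed as
# p56*4^4 + p34*16 + p12, with +1 because Julia arrays are 1-indexed.
binaryArrayToNumber(x) = foldl((y1, y2) -> y1 << 1 + y2, x)

p56 = binaryArrayToNumber([1, 1, 1, 0])  # qubits 5&6 -> 14
p34 = binaryArrayToNumber([0, 0, 0, 0])  # qubits 3&4 -> 0
p12 = binaryArrayToNumber([0, 0, 0, 0])  # qubits 1&2 -> 0

index = p56 * 4^4 + p34 * 16 + p12 + 1   # 14*256 + 1 = 3585
```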
In [34]:
#initialGates = generateGates(experiments[1],[circuit1q,circuit2q])
expOnFirstPair  = experiments[1][1][2]
expOnSecondPair = experiments[1][2][2]
expOnThirdPair  = experiments[1][3][2]
print(expOnFirstPair,expOnSecondPair,expOnThirdPair)
            
initialGates = circuit2q[expOnThirdPair]⊗circuit2q[expOnSecondPair]⊗circuit2q[expOnFirstPair]



reverseGates = transpose(initialGates); # yay superoperators.
experiment1_allProbs = []
# These will depend on your system
lengths =  [1,3,5,8,10,15,20,40,60,80]
@showprogress for m in lengths
    wholeCircuit = reverseGates*noise^m*initialGates*rm*start./64 # ./64 = Normalise the measurement/state vectors
    probs = [z'*wholeCircuit for z in zs]
    push!(experiment1_allProbs,probs)
end


# We need cumulative probability matrices
cumMatrix = map(cumsum,experiment1_allProbs)
experiment1_observed = shotSimulator(64,shotsToDo,cumMatrix);
(params,l, failed) = fitTheFidelities(lengths,experiment1_observed)
experiment1_fidelities = vcat(1,[p[2] for p in params]) # We don't fit the first one, it is always 1 for CPTP maps

fidelitiesExtracted = []
function binaryArrayToNumber(x)
    return foldl((y1,y2)->y1<<1+y2,x)
end

for x0 in all2QlMuBs[expOnThirdPair]
    p56 = binaryArrayToNumber(x0)
    for x1 in all2QlMuBs[expOnSecondPair]
        p34 = binaryArrayToNumber(x1)
        for x2 in all2QlMuBs[expOnFirstPair]
            p12 = binaryArrayToNumber(x2)
            push!(fidelitiesExtracted,p56*4^4+p34*16+p12+1) # +1 because Julia arrays are 1-indexed
        end
    end
end


# for x0 in paulisAll[1][3]
#     p56 = binaryArrayToNumber(x0)
#     for x1 in paulisAll[1][2]
#         p34 = binaryArrayToNumber(x1)
#         for x2 in paulisAll[1][1]
#             p12= binaryArrayToNumber(x2)
#             push!(fidelitiesExtracted,p56*4^4+p34*16+p12+1) # + 1 cause we index of 1 in Julia
#         end
#     end
# end
# And (just out of interest) we can compare the fidelities we extracted with the actual values
for i in 1:64
    print("$(fidelityLabels(fidelitiesExtracted[i]-1,qubits=6)) ($(fidelitiesExtracted[i]-1)):\tEstimate: $(experiment1_fidelities[i]) <-> $(actualOracle[fidelitiesExtracted[i]]) \tPercentage Error: $(round.(abs(actualOracle[fidelitiesExtracted[i]]-experiment1_fidelities[i])/(actualOracle[fidelitiesExtracted[i]])*100,digits=4))%\n")
end
222
Progress: 100%|█████████████████████████████████████████| Time: 0:01:22
IIIIII (0):	Estimate: 1.0 <-> 1.0 	Percentage Error: 0.0%
IIIIZY (14):	Estimate: 0.9836187322885085 <-> 0.9832064 	Percentage Error: 0.0419%
IIIIXZ (7):	Estimate: 0.9639149560621026 <-> 0.9634168 	Percentage Error: 0.0517%
IIIIYX (9):	Estimate: 0.97991687180008 <-> 0.979804 	Percentage Error: 0.0115%
IIZYII (224):	Estimate: 0.9904880047810369 <-> 0.99040064 	Percentage Error: 0.0088%
IIZYZY (238):	Estimate: 0.9897596835861429 <-> 0.9896211041 	Percentage Error: 0.014%
IIZYXZ (231):	Estimate: 0.970063280711357 <-> 0.9698631548 	Percentage Error: 0.0206%
IIZYYX (233):	Estimate: 0.9704973944327957 <-> 0.9702369407 	Percentage Error: 0.0268%
IIXZII (112):	Estimate: 0.9913602144274899 <-> 0.99120012 	Percentage Error: 0.0162%
IIXZZY (126):	Estimate: 0.9905850799039289 <-> 0.9904135538 	Percentage Error: 0.0173%
IIXZXZ (119):	Estimate: 0.9710270405876867 <-> 0.9706397831 	Percentage Error: 0.0399%
IIXZYX (121):	Estimate: 0.9713669111696482 <-> 0.9710202744 	Percentage Error: 0.0357%
IIYXII (144):	Estimate: 0.9985077456470959 <-> 0.9984006 	Percentage Error: 0.0107%
IIYXZY (158):	Estimate: 0.9820389965856001 <-> 0.9816210645 	Percentage Error: 0.0426%
IIYXXZ (151):	Estimate: 0.9627793127924436 <-> 0.961863116 	Percentage Error: 0.0953%
IIYXYX (153):	Estimate: 0.9785209950264261 <-> 0.9782369015 	Percentage Error: 0.029%
ZYIIII (3584):	Estimate: 0.9512704184526202 <-> 0.951232 	Percentage Error: 0.004%
ZYIIZY (3598):	Estimate: 0.9507057490779569 <-> 0.9507968975 	Percentage Error: 0.0096%
ZYIIXZ (3591):	Estimate: 0.9318305497849405 <-> 0.9318140799 	Percentage Error: 0.0018%
ZYIIYX (3593):	Estimate: 0.9320273704855208 <-> 0.9318593505 	Percentage Error: 0.018%
ZYZYII (3808):	Estimate: 0.9576946852145771 <-> 0.9576978427 	Percentage Error: 0.0003%
ZYZYZY (3822):	Estimate: 0.9412133254382572 <-> 0.941276231 	Percentage Error: 0.0067%
ZYZYXZ (3815):	Estimate: 0.922295169711348 <-> 0.9223237737 	Percentage Error: 0.0031%
ZYZYYX (3817):	Estimate: 0.9383138005903416 <-> 0.9383561771 	Percentage Error: 0.0045%
ZYXZII (3696):	Estimate: 0.9587926560545154 <-> 0.9584647295 	Percentage Error: 0.0342%
ZYXZZY (3710):	Estimate: 0.9425654934879208 <-> 0.9420363741 	Percentage Error: 0.0562%
ZYXZXZ (3703):	Estimate: 0.92339141164174 <-> 0.9230687405 	Percentage Error: 0.035%
ZYXZYX (3705):	Estimate: 0.939465275233001 <-> 0.9391075758 	Percentage Error: 0.0381%
ZYYXII (3728):	Estimate: 0.9505758184090124 <-> 0.9496978043 	Percentage Error: 0.0925%
ZYYXZY (3742):	Estimate: 0.9499205478803402 <-> 0.9492761929 	Percentage Error: 0.0679%
ZYYXXZ (3735):	Estimate: 0.9306643121087222 <-> 0.9303237365 	Percentage Error: 0.0366%
ZYYXYX (3737):	Estimate: 0.9312744179958689 <-> 0.9303561395 	Percentage Error: 0.0987%
XZIIII (1792):	Estimate: 0.9513329862753058 <-> 0.9518096 	Percentage Error: 0.0501%
XZIIZY (1806):	Estimate: 0.9353769438978451 <-> 0.9354397671 	Percentage Error: 0.0067%
XZIIXZ (1799):	Estimate: 0.9165118424555322 <-> 0.9166038358 	Percentage Error: 0.01%
XZIIYX (1801):	Estimate: 0.9319194365558223 <-> 0.9325868533 	Percentage Error: 0.0716%
XZZYII (2016):	Estimate: 0.9421912516986911 <-> 0.9422873138 	Percentage Error: 0.0102%
XZZYZY (2030):	Estimate: 0.9414158013240314 <-> 0.9419308673 	Percentage Error: 0.0547%
XZZYXZ (2023):	Estimate: 0.9225669804339347 <-> 0.9231250615 	Percentage Error: 0.0605%
XZZYYX (2025):	Estimate: 0.9226527128231293 <-> 0.9230953112 	Percentage Error: 0.0479%
XZXZII (1904):	Estimate: 0.9434457325925045 <-> 0.9430482665 	Percentage Error: 0.0421%
XZXZZY (1918):	Estimate: 0.942653009971232 <-> 0.9426851285 	Percentage Error: 0.0034%
XZXZXZ (1911):	Estimate: 0.9238292567898823 <-> 0.9238642637 	Percentage Error: 0.0038%
XZXZYX (1913):	Estimate: 0.9234554719666535 <-> 0.9238408957 	Percentage Error: 0.0417%
XZYXII (1936):	Estimate: 0.9504061076561945 <-> 0.9502872757 	Percentage Error: 0.0125%
XZYXZY (1950):	Estimate: 0.9344223468149836 <-> 0.9339308295 	Percentage Error: 0.0526%
XZYXXZ (1943):	Estimate: 0.9150527789048778 <-> 0.9151250245 	Percentage Error: 0.0079%
XZYXYX (1945):	Estimate: 0.9307148653758337 <-> 0.9310952739 	Percentage Error: 0.0409%
YXIIII (2304):	Estimate: 0.9836592542669564 <-> 0.9834048 	Percentage Error: 0.0259%
YXIIZY (2318):	Estimate: 0.9829948836613357 <-> 0.9826867828 	Percentage Error: 0.0314%
YXIIXZ (2311):	Estimate: 0.9629371510619611 <-> 0.9630672783 	Percentage Error: 0.0135%
YXIIYX (2313):	Estimate: 0.9632530202802688 <-> 0.9633823887 	Percentage Error: 0.0134%
YXZYII (2528):	Estimate: 0.9899561201666218 <-> 0.9898191868 	Percentage Error: 0.0138%
YXZYZY (2542):	Estimate: 0.9734135813520179 <-> 0.9731151128 	Percentage Error: 0.0307%
YXZYXZ (2535):	Estimate: 0.9534757475781179 <-> 0.953526987 	Percentage Error: 0.0054%
YXZYYX (2537):	Estimate: 0.9693840433966625 <-> 0.9698287985 	Percentage Error: 0.0459%
YXXZII (2416):	Estimate: 0.9908507151068997 <-> 0.9906117951 	Percentage Error: 0.0241%
YXXZZY (2430):	Estimate: 0.9743807320915888 <-> 0.9739007512 	Percentage Error: 0.0493%
YXXZXZ (2423):	Estimate: 0.9545487729278829 <-> 0.9542969401 	Percentage Error: 0.0264%
YXXZYX (2425):	Estimate: 0.9704739272266707 <-> 0.9706053993 	Percentage Error: 0.0135%
YXYXII (2448):	Estimate: 0.9822206102326522 <-> 0.9818191472 	Percentage Error: 0.0409%
YXYXZY (2462):	Estimate: 0.9814222407864774 <-> 0.9811150735 	Percentage Error: 0.0313%
YXYXXZ (2455):	Estimate: 0.9615603517693696 <-> 0.9615269485 	Percentage Error: 0.0035%
YXYXYX (2457):	Estimate: 0.9620298364703991 <-> 0.9618287597 	Percentage Error: 0.0209%

So now we need to put the estimates from these experiments into the oracle, at the offsets we extracted.

This is step 2 in the diagram from the paper (a long way back up there $\uparrow$).

Basically, for each pair of qubits we have to cycle through four of the two-qubit MUBs (there are five, but we have done one already). That means we have another 12 experiments, i.e. 2n where n is the number of qubits. In total, each sub-sampling group needs 2n+1 experiments. Let's create the initial and reverse gates for the 3 MUBs we haven't already got ([II,IX,XI,XX], [II,IY,YI,YY] and [II,IZ,ZI,ZZ]).
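The counting in the paragraph above can be tallied directly. A small sketch, assuming 6 qubits split into 2-qubit pairs with 5 two-qubit MUBs, one of which was covered by the initial experiment:

```julia
n = 6                            # qubits
pairs = n ÷ 2                    # 2-qubit groups: (1,2), (3,4), (5,6)
mubs = 5                         # two-qubit MUBs per pair

additional = pairs * (mubs - 1)  # 4 new MUBs per pair -> 12 extra experiments
total = 1 + additional           # plus the initial experiment -> 13 = 2n + 1

println((additional, total))     # (12, 13)
```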

In [35]:
# We already had the first bit, repeating just to make the cell self-contained.
circuit2q = [superCnot21*superPI*superCnot21*superIP*superHH,
             superCnot21*superPI*superCnot21*superPI*superHH]

push!(circuit2q,superHH) # The XX one
push!(circuit2q,superIP*superPI*superHH) # the YY one
push!(circuit2q,makeSuper([1 0;0 1]⊗[1 0;0 1])) # The ZZ one i.e. nothing.
Out[35]:
5-element Array{Array{Float64,2},1}:
 [1.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0]
 [1.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0]
 [1.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0]
 [1.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0; … ; 0.0 0.0 … 0.0 0.0; 0.0 0.0 … 0.0 0.0]
 [1.0 0.0 … 0.0 0.0; 0.0 1.0 … 0.0 0.0; … ; 0.0 0.0 … 1.0 0.0; 0.0 0.0 … 0.0 1.0]
In [36]:
# So I am going to do all of the above for each of the 5 experiment types (other than the one already done), for each of the qubit pairs
e1_all_additional_fidelities = []
e1_fidelity_extracted = []
# Note we don't actually need to save the actual probabilities - but I am going to use them later
# to demonstrate some different recovery regimes.
e1_all_actualProbabilities = []
@showprogress 1 "QubitPairs" for qubitPairOn = 1:3 # qubit pairs here are 1&2, 3&4 and 5&6
    for experimentType = 1:5
        if experiments[1][qubitPairOn][2] != experimentType
            # Its one we haven't done
            expOnFirstPair  = experiments[1][1][2]
            expOnSecondPair = experiments[1][2][2]
            expOnThirdPair  = experiments[1][3][2]

            
            if qubitPairOn == 1
                expOnFirstPair = experimentType
            elseif qubitPairOn == 2
                expOnSecondPair = experimentType
            else
                expOnThirdPair = experimentType
            end
            initialGates = circuit2q[expOnThirdPair]⊗circuit2q[expOnSecondPair]⊗circuit2q[expOnFirstPair]

            reverseGates = transpose(initialGates) # yay superoperators.
            # So at this point we have set up one of the 'new' experiments.
            additionalExperiment = []
            # Get the actual probabilities.
            @showprogress 1 "QGroup: $qubitPairOn Exp: $experimentType" for m in lengths
                wholeCircuit = reverseGates*noise^m*initialGates*rm*start./64 # Normalising our zs
                probs = [z'*wholeCircuit for z in zs]
                push!(additionalExperiment,probs)
            end
            push!(e1_all_actualProbabilities,additionalExperiment)
            # Generate the measurement statistics
            cumMatrix = map(cumsum,additionalExperiment)
            experiment1_additional_observed = shotSimulator(64,shotsToDo,cumMatrix);
            # Fit and extract the fidelities
            (params,l, failed) = fitTheFidelities(lengths,experiment1_additional_observed)
            experiment1_additional_fidelities = vcat(1,[p[2] for p in params])
            push!(e1_all_additional_fidelities,experiment1_additional_fidelities)
            fidelitiesExtracted=[]
            for x0 in all2QlMuBs[expOnThirdPair]
                p56 = binaryArrayToNumber(x0)
                for x1 in all2QlMuBs[expOnSecondPair]
                    p34 = binaryArrayToNumber(x1)
                    for x2 in all2QlMuBs[expOnFirstPair]
                        p12= binaryArrayToNumber(x2)
                        push!(fidelitiesExtracted,p56*4^4+p34*16+p12+1) # +1 because Julia arrays are 1-indexed
                    end
                end
            end
            push!(e1_fidelity_extracted,fidelitiesExtracted)
        end
    end
end
QGroup: 1 Exp: 1100%|███████████████████████████████████| Time: 0:01:23
QGroup: 1 Exp: 3100%|███████████████████████████████████| Time: 0:01:24
QGroup: 1 Exp: 4100%|███████████████████████████████████| Time: 0:01:22
QGroup: 1 Exp: 5100%|███████████████████████████████████| Time: 0:01:23
QGroup: 2 Exp: 1100%|███████████████████████████████████| Time: 0:01:26
QGroup: 2 Exp: 3100%|███████████████████████████████████| Time: 0:01:22
QGroup: 2 Exp: 4100%|███████████████████████████████████| Time: 0:01:24
QGroup: 2 Exp: 5100%|███████████████████████████████████| Time: 0:01:24
QGroup: 3 Exp: 1100%|███████████████████████████████████| Time: 0:01:23
QGroup: 3 Exp: 3100%|███████████████████████████████████| Time: 0:01:17
QGroup: 3 Exp: 4100%|███████████████████████████████████| Time: 0:01:17
QGroup: 3 Exp: 5100%|███████████████████████████████████| Time: 0:01:17
QubitPairs100%|█████████████████████████████████████████| Time: 0:16:27
In [37]:
# So for example, if we look at some of these additional 12 experiments, we see
# the following - look at the changing eigenvalues we sample (left hand side).
# E.g. experiment 12 has qubits 5&6 with the II IZ ZI ZZ stabiliser.

toExtract = 12

# And (just out of interest) we can compare the fidelities we extracted with the actual values
for i in 1:64
    print("$(fidelityLabels(e1_fidelity_extracted[toExtract][i]-1,qubits=6)) ",
          "($(e1_fidelity_extracted[toExtract][i])): ",
          "\tEstimate: $(round(e1_all_additional_fidelities[toExtract][i],digits=5))",
          " <-> $(round(actualOracle[e1_fidelity_extracted[toExtract][i]],digits=5)) ",
          "\tPercentage Error: $(round.(abs(actualOracle[e1_fidelity_extracted[toExtract][i]]-e1_all_additional_fidelities[toExtract][i])/(actualOracle[e1_fidelity_extracted[toExtract][i]])*100,digits=4))%\n")
    
end
IIIIII (1): 	Estimate: 1.0 <-> 1.0 	Percentage Error: 0.0%
IIIIZY (15): 	Estimate: 0.98344 <-> 0.98321 	Percentage Error: 0.0237%
IIIIXZ (8): 	Estimate: 0.96326 <-> 0.96342 	Percentage Error: 0.016%
IIIIYX (10): 	Estimate: 0.98009 <-> 0.9798 	Percentage Error: 0.0288%
IIZYII (225): 	Estimate: 0.99057 <-> 0.9904 	Percentage Error: 0.0168%
IIZYZY (239): 	Estimate: 0.98959 <-> 0.98962 	Percentage Error: 0.0029%
IIZYXZ (232): 	Estimate: 0.96985 <-> 0.96986 	Percentage Error: 0.0018%
IIZYYX (234): 	Estimate: 0.97078 <-> 0.97024 	Percentage Error: 0.0556%
IIXZII (113): 	Estimate: 0.99144 <-> 0.9912 	Percentage Error: 0.0241%
IIXZZY (127): 	Estimate: 0.99054 <-> 0.99041 	Percentage Error: 0.0127%
IIXZXZ (120): 	Estimate: 0.97058 <-> 0.97064 	Percentage Error: 0.006%
IIXZYX (122): 	Estimate: 0.97178 <-> 0.97102 	Percentage Error: 0.0786%
IIYXII (145): 	Estimate: 0.99838 <-> 0.9984 	Percentage Error: 0.0019%
IIYXZY (159): 	Estimate: 0.98207 <-> 0.98162 	Percentage Error: 0.0457%
IIYXXZ (152): 	Estimate: 0.96182 <-> 0.96186 	Percentage Error: 0.0047%
IIYXYX (154): 	Estimate: 0.9784 <-> 0.97824 	Percentage Error: 0.0164%
IZIIII (769): 	Estimate: 0.95314 <-> 0.952 	Percentage Error: 0.1194%
IZIIZY (783): 	Estimate: 0.93591 <-> 0.93563 	Percentage Error: 0.0298%
IZIIXZ (776): 	Estimate: 0.91672 <-> 0.91679 	Percentage Error: 0.0071%
IZIIYX (778): 	Estimate: 0.9337 <-> 0.93277 	Percentage Error: 0.0995%
IZZYII (993): 	Estimate: 0.94322 <-> 0.94248 	Percentage Error: 0.0788%
IZZYZY (1007): 	Estimate: 0.943 <-> 0.94212 	Percentage Error: 0.0935%
IZZYXZ (1000): 	Estimate: 0.92388 <-> 0.92331 	Percentage Error: 0.0621%
IZZYYX (1002): 	Estimate: 0.92456 <-> 0.92328 	Percentage Error: 0.1388%
IZXZII (881): 	Estimate: 0.94355 <-> 0.94324 	Percentage Error: 0.0328%
IZXZZY (895): 	Estimate: 0.94297 <-> 0.94287 	Percentage Error: 0.0099%
IZXZXZ (888): 	Estimate: 0.9236 <-> 0.92405 	Percentage Error: 0.0485%
IZXZYX (890): 	Estimate: 0.92427 <-> 0.92403 	Percentage Error: 0.0262%
IZYXII (913): 	Estimate: 0.95116 <-> 0.95048 	Percentage Error: 0.0716%
IZYXZY (927): 	Estimate: 0.93411 <-> 0.93412 	Percentage Error: 0.0008%
IZYXXZ (920): 	Estimate: 0.91568 <-> 0.91531 	Percentage Error: 0.0402%
IZYXYX (922): 	Estimate: 0.9319 <-> 0.93128 	Percentage Error: 0.0665%
ZIIIII (3073): 	Estimate: 0.99922 <-> 0.9992 	Percentage Error: 0.0024%
ZIIIZY (3087): 	Estimate: 0.98264 <-> 0.98241 	Percentage Error: 0.0236%
ZIIIXZ (3080): 	Estimate: 0.96244 <-> 0.96264 	Percentage Error: 0.0206%
ZIIIYX (3082): 	Estimate: 0.97929 <-> 0.97902 	Percentage Error: 0.0275%
ZIZYII (3297): 	Estimate: 0.98974 <-> 0.9896 	Percentage Error: 0.0142%
ZIZYZY (3311): 	Estimate: 0.98882 <-> 0.98883 	Percentage Error: 0.0013%
ZIZYXZ (3304): 	Estimate: 0.96902 <-> 0.96909 	Percentage Error: 0.007%
ZIZYYX (3306): 	Estimate: 0.96998 <-> 0.96945 	Percentage Error: 0.0541%
ZIXZII (3185): 	Estimate: 0.99059 <-> 0.9904 	Percentage Error: 0.0193%
ZIXZZY (3199): 	Estimate: 0.98978 <-> 0.98962 	Percentage Error: 0.0165%
ZIXZXZ (3192): 	Estimate: 0.96985 <-> 0.96986 	Percentage Error: 0.0012%
ZIXZYX (3194): 	Estimate: 0.971 <-> 0.97024 	Percentage Error: 0.0786%
ZIYXII (3217): 	Estimate: 0.99761 <-> 0.9976 	Percentage Error: 0.0006%
ZIYXZY (3231): 	Estimate: 0.98131 <-> 0.98083 	Percentage Error: 0.0492%
ZIYXXZ (3224): 	Estimate: 0.96122 <-> 0.96109 	Percentage Error: 0.0136%
ZIYXYX (3226): 	Estimate: 0.97779 <-> 0.97745 	Percentage Error: 0.0339%
ZZIIII (3841): 	Estimate: 0.95228 <-> 0.95124 	Percentage Error: 0.1091%
ZZIIZY (3855): 	Estimate: 0.9351 <-> 0.93487 	Percentage Error: 0.0241%
ZZIIXZ (3848): 	Estimate: 0.91584 <-> 0.91605 	Percentage Error: 0.0232%
ZZIIYX (3850): 	Estimate: 0.93288 <-> 0.93203 	Percentage Error: 0.091%
ZZZYII (4065): 	Estimate: 0.94233 <-> 0.94172 	Percentage Error: 0.0648%
ZZZYZY (4079): 	Estimate: 0.94211 <-> 0.94137 	Percentage Error: 0.079%
ZZZYXZ (4072): 	Estimate: 0.92296 <-> 0.92257 	Percentage Error: 0.0423%
ZZZYYX (4074): 	Estimate: 0.92375 <-> 0.92254 	Percentage Error: 0.1311%
ZZXZII (3953): 	Estimate: 0.94266 <-> 0.94248 	Percentage Error: 0.0196%
ZZXZZY (3967): 	Estimate: 0.94212 <-> 0.94212 	Percentage Error: 0.0001%
ZZXZXZ (3960): 	Estimate: 0.92275 <-> 0.92331 	Percentage Error: 0.0608%
ZZXZYX (3962): 	Estimate: 0.92347 <-> 0.92328 	Percentage Error: 0.0205%
ZZYXII (3985): 	Estimate: 0.95038 <-> 0.94972 	Percentage Error: 0.0695%
ZZYXZY (3999): 	Estimate: 0.93337 <-> 0.93337 	Percentage Error: 0.0009%
ZZYXXZ (3992): 	Estimate: 0.91486 <-> 0.91457 	Percentage Error: 0.0317%
ZZYXYX (3994): 	Estimate: 0.93119 <-> 0.93054 	Percentage Error: 0.0703%
In [38]:
# So we just need to fill in the oracle

for (expNo,x) in enumerate(e1_fidelity_extracted)
    for (fidelityIndex,fidelity)  in enumerate(x)
        push!(estimateOracle[fidelity],e1_all_additional_fidelities[expNo][fidelityIndex])
    end
end
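Entries that several experiments sampled now hold multiple estimates; as noted earlier, we will want to end up averaging them before peeling. A minimal sketch of that averaging step on a toy three-entry oracle (the real one above has 4096 entries):

```julia
using Statistics  # for mean

# Toy stand-in for estimateOracle: entry 2 was never sampled,
# entries 1 and 3 were each sampled twice.
toyOracle = [[1.0, 1.0], Float64[], [0.97102, 0.97078]]

# Average duplicate estimates; leave unsampled entries as `missing`.
averaged = [isempty(v) ? missing : mean(v) for v in toyOracle]
```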
In [39]:
# our oracle so far
for i in 1:4096
    if estimateOracle[i]!=[]
        print("$(string(i,pad=3)) $(round.(actualOracle[i],digits=5)) \t$(round.(estimateOracle[i],digits=5))\n")
    end
end
001 1.0 	[1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
002 0.972 	[0.97245]
003 0.984 	[0.98379]
004 0.972 	[0.97222]
005 0.9914 	[0.99144]
006 0.97941 	[0.97957]
007 0.9914 	[0.99154]
008 0.96342 	[0.96466, 0.96398, 0.96294, 0.96325, 0.9632, 0.96397, 0.96343, 0.96334, 0.96326]
009 0.9918 	[0.99173]
010 0.9798 	[0.98004, 0.98015, 0.97996, 0.97959, 0.97987, 0.9798, 0.97983, 0.97964, 0.98009]
011 0.9918 	[0.99173]
012 0.96381 	[0.96422]
013 0.9992 	[0.99924]
014 0.97122 	[0.97099]
015 0.98321 	[0.98326, 0.98301, 0.98303, 0.98328, 0.98312, 0.98329, 0.98322, 0.98338, 0.98344]
016 0.97122 	[0.97164]
017 0.999 	[0.99905]
024 0.96245 	[0.9621]
026 0.97882 	[0.97902]
031 0.98222 	[0.98198]
033 0.9912 	[0.99122]
040 0.97064 	[0.96982]
042 0.97102 	[0.97078]
047 0.99041 	[0.99034]
049 0.9914 	[0.99143]
056 0.97083 	[0.97072]
058 0.97122 	[0.97188]
063 0.99061 	[0.99061]
065 0.9998 	[0.99983]
072 0.96322 	[0.96257]
074 0.97961 	[0.97967]
079 0.98301 	[0.98293]
081 0.9988 	[0.99889]
088 0.96225 	[0.96175]
090 0.97863 	[0.97867]
095 0.98202 	[0.9819]
097 0.991 	[0.99091]
104 0.97045 	[0.97043]
106 0.97082 	[0.97099]
111 0.99022 	[0.99018]
113 0.9912 	[0.9911, 0.99132, 0.99138, 0.99103, 0.99118, 0.99143, 0.99122, 0.99126, 0.99144]
114 0.97922 	[0.97945]
115 0.99121 	[0.99099]
116 0.96322 	[0.96393]
117 0.9986 	[0.99861]
118 0.97063 	[0.97115]
119 0.98261 	[0.98272]
120 0.97064 	[0.97126, 0.9713, 0.97094, 0.97077, 0.97058]
121 0.999 	[0.99898]
122 0.97102 	[0.97163, 0.97103, 0.97088, 0.97107, 0.97178]
123 0.98301 	[0.98273]
124 0.97103 	[0.97133]
125 0.9904 	[0.99044]
126 0.97843 	[0.97817]
127 0.99041 	[0.99052, 0.99049, 0.99049, 0.9904, 0.99054]
128 0.96245 	[0.96317]
129 0.9994 	[0.99942]
136 0.96283 	[0.96246]
138 0.97922 	[0.9786]
143 0.98261 	[0.98276]
145 0.9984 	[0.99839, 0.99843, 0.99843, 0.99828, 0.99835, 0.99856, 0.99835, 0.9983, 0.99838]
146 0.97043 	[0.97101]
147 0.98241 	[0.98191]
148 0.97045 	[0.97054]
149 0.9898 	[0.98999]
150 0.97785 	[0.9781]
151 0.98982 	[0.98993]
152 0.96186 	[0.96303, 0.96228, 0.96186, 0.9618, 0.96182]
153 0.9902 	[0.99001]
154 0.97824 	[0.97831, 0.97833, 0.97787, 0.97816, 0.9784]
155 0.99022 	[0.98994]
156 0.96225 	[0.96273]
157 0.9976 	[0.99757]
158 0.96965 	[0.96954]
159 0.98162 	[0.98164, 0.98198, 0.98148, 0.98163, 0.98207]
160 0.96967 	[0.97002]
161 0.9906 	[0.99076]
168 0.97006 	[0.96884]
170 0.97043 	[0.96998]
175 0.98982 	[0.98971]
177 0.9908 	[0.99058]
184 0.97025 	[0.97032]
186 0.97063 	[0.97082]
191 0.99002 	[0.98983]
193 0.9992 	[0.99919]
200 0.96264 	[0.96275]
202 0.97902 	[0.97906]
207 0.98241 	[0.98238]
209 0.9982 	[0.99823]
216 0.96167 	[0.96222]
218 0.97804 	[0.9784]
223 0.98142 	[0.98127]
225 0.9904 	[0.99038, 0.99055, 0.99058, 0.99016, 0.99042, 0.99059, 0.99034, 0.99046, 0.99057]
226 0.97843 	[0.97859]
227 0.99041 	[0.99021]
228 0.96245 	[0.96336]
229 0.9978 	[0.99777]
230 0.96985 	[0.97035]
231 0.98182 	[0.98204]
232 0.96986 	[0.97036, 0.97029, 0.96985, 0.97001, 0.96985]
233 0.9982 	[0.99811]
234 0.97024 	[0.9708, 0.97016, 0.97003, 0.96999, 0.97078]
235 0.98222 	[0.9818]
236 0.97025 	[0.97049]
237 0.9896 	[0.98967]
238 0.97765 	[0.97746]
239 0.98962 	[0.98974, 0.98969, 0.9898, 0.98967, 0.98959]
240 0.96167 	[0.96259]
241 0.9906 	[0.99058]
248 0.97006 	[0.97013]
250 0.97043 	[0.97112]
255 0.98982 	[0.98974]
257 0.984 	[0.98416]
264 0.96365 	[0.96451]
266 0.96397 	[0.96369]
271 0.98328 	[0.98343]
369 0.99121 	[0.99107]
376 0.95487 	[0.95525]
378 0.97119 	[0.97133]
383 0.97449 	[0.97444]
401 0.98241 	[0.98228]
408 0.9621 	[0.96265]
410 0.96241 	[0.96167]
415 0.9817 	[0.98183]
481 0.99041 	[0.9903]
488 0.9541 	[0.95443]
490 0.97041 	[0.97051]
495 0.9737 	[0.97375]
513 0.952 	[0.95181]
520 0.93256 	[0.93235]
522 0.93261 	[0.93229]
527 0.95156 	[0.95112]
625 0.95923 	[0.95877]
632 0.92381 	[0.92436]
634 0.93986 	[0.93907]
639 0.9428 	[0.94314]
657 0.95046 	[0.95032]
664 0.93107 	[0.93085]
666 0.93111 	[0.93106]
671 0.95004 	[0.94956]
737 0.95846 	[0.95808]
744 0.92307 	[0.92348]
746 0.93911 	[0.93804]
751 0.94204 	[0.94229]
769 0.952 	[0.95314]
776 0.91679 	[0.91672]
778 0.93277 	[0.9337]
783 0.93563 	[0.93591]
881 0.94324 	[0.94355]
888 0.92405 	[0.9236]
890 0.92403 	[0.92427]
895 0.94287 	[0.94297]
913 0.95048 	[0.95116]
920 0.91531 	[0.91568]
922 0.93128 	[0.9319]
927 0.93412 	[0.93411]
993 0.94248 	[0.94322]
1000 0.92331 	[0.92388]
1002 0.92328 	[0.92456]
1007 0.94212 	[0.943]
1025 0.9998 	[0.99981]
1032 0.96322 	[0.96318]
1034 0.97961 	[0.97964]
1039 0.98301 	[0.98303]
1137 0.991 	[0.99103]
1144 0.97045 	[0.97076]
1146 0.97082 	[0.97067]
1151 0.99022 	[0.99032]
1169 0.9982 	[0.99815]
1176 0.96167 	[0.96161]
1178 0.97804 	[0.97769]
1183 0.98142 	[0.98127]
1249 0.9902 	[0.99013]
1256 0.96967 	[0.96964]
1258 0.97004 	[0.96975]
1263 0.98942 	[0.98961]
1281 0.9838 	[0.98392]
1288 0.96345 	[0.96429]
1290 0.96377 	[0.96347]
1295 0.98308 	[0.98325]
1393 0.99101 	[0.99088]
1400 0.95468 	[0.95503]
1402 0.97099 	[0.97113]
1407 0.97429 	[0.97416]
1425 0.98222 	[0.98206]
1432 0.96191 	[0.96242]
1434 0.96222 	[0.96148]
1439 0.98151 	[0.98164]
1505 0.99022 	[0.99009]
1512 0.95391 	[0.95415]
1514 0.97022 	[0.97026]
1519 0.97351 	[0.97342]
1537 0.95181 	[0.95153]
1544 0.93237 	[0.93249]
1546 0.93242 	[0.93138]
1551 0.95137 	[0.9516]
1649 0.95904 	[0.95917]
1656 0.92363 	[0.92448]
1658 0.93967 	[0.93918]
1663 0.94261 	[0.94289]
1681 0.95027 	[0.94976]
1688 0.93088 	[0.93198]
1690 0.93092 	[0.93027]
1695 0.94985 	[0.9503]
1761 0.95827 	[0.95832]
1768 0.92288 	[0.92245]
1770 0.93892 	[0.93812]
1775 0.94185 	[0.94178]
1793 0.95181 	[0.95238, 0.95156, 0.95179, 0.95135, 0.95203, 0.9521, 0.9519, 0.95136, 0.95258]
1794 0.92477 	[0.92446]
1795 0.9362 	[0.93528]
1796 0.92516 	[0.92545]
1797 0.94324 	[0.94294]
1798 0.93221 	[0.93232]
1799 0.94363 	[0.94354]
1800 0.9166 	[0.91648, 0.91646, 0.91656, 0.91784, 0.9183]
1801 0.94362 	[0.94319]
1802 0.93259 	[0.93246, 0.93224, 0.93296, 0.93349, 0.93417]
1803 0.94401 	[0.94303]
1804 0.91697 	[0.91792]
1805 0.95105 	[0.95141]
1806 0.92403 	[0.92447]
1807 0.93544 	[0.9354, 0.93484, 0.93514, 0.93512, 0.93569]
1808 0.92442 	[0.9247]
1809 0.95086 	[0.951]
1816 0.91568 	[0.91548]
1818 0.93165 	[0.93214]
1823 0.9345 	[0.93388]
1825 0.94305 	[0.94298]
1832 0.92386 	[0.92431]
1834 0.92384 	[0.92486]
1839 0.94269 	[0.94205]
1841 0.94324 	[0.94415]
1848 0.92405 	[0.92538]
1850 0.92403 	[0.92629]
1855 0.94287 	[0.94276]
1857 0.95162 	[0.9515]
1864 0.91642 	[0.91693]
1866 0.9324 	[0.93311]
1871 0.93525 	[0.93491]
1873 0.95067 	[0.95053]
1880 0.91549 	[0.91576]
1882 0.93147 	[0.93215]
1887 0.93431 	[0.9336]
1889 0.94286 	[0.94235]
1896 0.92368 	[0.92356]
1898 0.92365 	[0.92281]
1903 0.9425 	[0.94212]
1905 0.94305 	[0.94339, 0.94273, 0.9423, 0.9423, 0.94314]
1906 0.93203 	[0.93152]
1907 0.94344 	[0.94238]
1908 0.91642 	[0.91613]
1909 0.95048 	[0.95032]
1910 0.92347 	[0.92295]
1911 0.93487 	[0.93504]
1912 0.92386 	[0.92416]
1913 0.95086 	[0.95001]
1914 0.92384 	[0.92337]
1915 0.93525 	[0.93367]
1916 0.92423 	[0.92525]
1917 0.94229 	[0.9425]
1918 0.93128 	[0.93184]
1919 0.94269 	[0.94333]
1920 0.91568 	[0.91546]
1921 0.95124 	[0.95061]
1928 0.91605 	[0.91689]
1930 0.93203 	[0.93251]
1935 0.93487 	[0.93434]
1937 0.95029 	[0.9507, 0.95027, 0.94995, 0.94922, 0.94962]
1938 0.92328 	[0.92244]
1939 0.93468 	[0.93311]
1940 0.92368 	[0.92291]
1941 0.94172 	[0.94114]
1942 0.93072 	[0.93039]
1943 0.94212 	[0.94191]
1944 0.91513 	[0.9153]
1945 0.9421 	[0.94118]
1946 0.9311 	[0.9307]
1947 0.9425 	[0.94089]
1948 0.91549 	[0.91673]
1949 0.94953 	[0.94918]
1950 0.92254 	[0.92334]
1951 0.93393 	[0.93375]
1952 0.92294 	[0.92233]
1953 0.94248 	[0.94202]
1960 0.92331 	[0.92313]
1962 0.92328 	[0.92369]
1967 0.94212 	[0.94103]
1969 0.94267 	[0.94262]
1976 0.92349 	[0.92362]
1978 0.92347 	[0.92327]
1983 0.94231 	[0.94203]
1985 0.95105 	[0.95136]
1992 0.91586 	[0.9177]
1994 0.93184 	[0.93347]
1999 0.93469 	[0.93516]
2001 0.9501 	[0.95018]
2008 0.91494 	[0.91412]
2010 0.93091 	[0.93001]
2015 0.93374 	[0.93245]
2017 0.94229 	[0.94288, 0.94269, 0.94172, 0.94149, 0.94266]
2018 0.93128 	[0.93103]
2019 0.94269 	[0.94167]
2020 0.91568 	[0.91585]
2021 0.94972 	[0.94985]
2022 0.92272 	[0.92263]
2023 0.93412 	[0.93443]
2024 0.92313 	[0.92311]
2025 0.9501 	[0.94919]
2026 0.9231 	[0.92261]
2027 0.9345 	[0.93292]
2028 0.92349 	[0.92483]
2029 0.94153 	[0.942]
2030 0.93054 	[0.93122]
2031 0.94193 	[0.94241]
2032 0.91494 	[0.91519]
2033 0.94248 	[0.94316]
2040 0.92331 	[0.92479]
2042 0.92328 	[0.92505]
2047 0.94212 	[0.94181]
2049 0.9994 	[0.99943]
2056 0.96283 	[0.96281]
2058 0.97922 	[0.97906]
2063 0.98261 	[0.98284]
2161 0.9906 	[0.99071]
2168 0.97006 	[0.97017]
2170 0.97043 	[0.97057]
2175 0.98982 	[0.98989]
2193 0.9978 	[0.99772]
2200 0.96128 	[0.96122]
2202 0.97765 	[0.97763]
2207 0.98103 	[0.98104]
2273 0.9898 	[0.98994]
2280 0.96928 	[0.96948]
2282 0.96965 	[0.96948]
2287 0.98903 	[0.98914]
2305 0.9834 	[0.98339, 0.98366, 0.98343, 0.98324, 0.98356, 0.98321, 0.98309, 0.98378, 0.98364]
2306 0.97158 	[0.9719]
2307 0.98347 	[0.98351]
2308 0.95565 	[0.95684]
2309 0.99081 	[0.9906]
2310 0.96299 	[0.96299]
2311 0.97488 	[0.97548]
2312 0.96307 	[0.96325, 0.96355, 0.9629, 0.96307, 0.96264]
2313 0.99121 	[0.99128]
2314 0.96338 	[0.96376, 0.96409, 0.96233, 0.9639, 0.96396]
2315 0.97528 	[0.97523]
2316 0.96345 	[0.96479]
2317 0.98261 	[0.98285]
2318 0.9708 	[0.97093]
2319 0.98269 	[0.9828, 0.98256, 0.98264, 0.98297, 0.98268]
2320 0.95487 	[0.95619]
2321 0.98241 	[0.98209]
2328 0.9621 	[0.96197]
2330 0.96241 	[0.96132]
2335 0.9817 	[0.98188]
2337 0.99061 	[0.99079]
2344 0.9543 	[0.9554]
2346 0.97061 	[0.97088]
2351 0.9739 	[0.9742]
2353 0.99081 	[0.9909]
2360 0.95449 	[0.95424]
2362 0.9708 	[0.97116]
2367 0.9741 	[0.97387]
2369 0.98321 	[0.98292]
2376 0.96287 	[0.96273]
2378 0.96319 	[0.96212]
2383 0.98249 	[0.98251]
2385 0.98222 	[0.98198]
2392 0.96191 	[0.96177]
2394 0.96222 	[0.96108]
2399 0.98151 	[0.98182]
2401 0.99041 	[0.99034]
2408 0.9541 	[0.95492]
2410 0.97041 	[0.97084]
2415 0.9737 	[0.9737]
2417 0.99061 	[0.99064, 0.99078, 0.99049, 0.99059, 0.99054]
2418 0.9628 	[0.96333]
2419 0.97469 	[0.97434]
2420 0.96287 	[0.96322]
2421 0.98202 	[0.98196]
2422 0.97022 	[0.97052]
2423 0.9821 	[0.98276]
2424 0.9543 	[0.95555]
2425 0.98241 	[0.98215]
2426 0.97061 	[0.97074]
2427 0.98249 	[0.98255]
2428 0.95468 	[0.95596]
2429 0.98982 	[0.98981]
2430 0.96202 	[0.96254]
2431 0.9739 	[0.97363]
2432 0.9621 	[0.96279]
2433 0.98281 	[0.98317]
2440 0.96249 	[0.96218]
2442 0.9628 	[0.96306]
2447 0.9821 	[0.98227]
2449 0.98182 	[0.98173, 0.98214, 0.98201, 0.98149, 0.98193]
2450 0.97002 	[0.97049]
2451 0.9819 	[0.98197]
2452 0.9541 	[0.95531]
2453 0.98923 	[0.98912]
2454 0.96144 	[0.9618]
2455 0.97331 	[0.97409]
2456 0.96153 	[0.9616]
2457 0.98962 	[0.98958]
2458 0.96183 	[0.96159]
2459 0.9737 	[0.97363]
2460 0.96191 	[0.96384]
2461 0.98103 	[0.98125]
2462 0.96925 	[0.96961]
2463 0.98112 	[0.98115]
2464 0.95333 	[0.95474]
2465 0.99002 	[0.99015]
2472 0.95372 	[0.95451]
2474 0.97002 	[0.9698]
2479 0.97331 	[0.97335]
2481 0.99022 	[0.98989]
2488 0.95391 	[0.95494]
2490 0.97022 	[0.97082]
2495 0.97351 	[0.97329]
2497 0.98261 	[0.98281]
2504 0.9623 	[0.9618]
2506 0.96261 	[0.96332]
2511 0.9819 	[0.98171]
2513 0.98162 	[0.98141]
2520 0.96133 	[0.96158]
2522 0.96163 	[0.96225]
2527 0.98092 	[0.98075]
2529 0.98982 	[0.98988, 0.98995, 0.98975, 0.9898, 0.98975]
2530 0.96202 	[0.96235]
2531 0.9739 	[0.97386]
2532 0.9621 	[0.9629]
2533 0.98123 	[0.98121]
2534 0.96944 	[0.96963]
2535 0.98131 	[0.98193]
2536 0.95353 	[0.95451]
2537 0.98162 	[0.98138]
2538 0.96983 	[0.96987]
2539 0.9817 	[0.98195]
2540 0.95391 	[0.95514]
2541 0.98903 	[0.98907]
2542 0.96125 	[0.96167]
2543 0.97312 	[0.97294]
2544 0.96133 	[0.96237]
2545 0.99002 	[0.99006]
2552 0.95372 	[0.95373]
2554 0.97002 	[0.97015]
2559 0.97331 	[0.97296]
2561 0.95142 	[0.95128]
2568 0.932 	[0.93179]
2570 0.93205 	[0.93218]
2575 0.95099 	[0.9505]
2673 0.95866 	[0.95806]
2680 0.92326 	[0.9237]
2682 0.9393 	[0.93844]
2687 0.94223 	[0.94256]
2705 0.94989 	[0.9496]
2712 0.93051 	[0.93019]
2714 0.93054 	[0.93041]
2719 0.94947 	[0.94886]
2785 0.95789 	[0.95747]
2792 0.92251 	[0.92287]
2794 0.93854 	[0.93755]
2799 0.94147 	[0.94172]
2817 0.95143 	[0.95153]
2824 0.91623 	[0.91568]
2826 0.93221 	[0.93108]
2831 0.93506 	[0.93468]
2929 0.94267 	[0.9423]
2936 0.92349 	[0.92419]
2938 0.92347 	[0.92289]
2943 0.94231 	[0.94234]
2961 0.94991 	[0.95035]
2968 0.91476 	[0.91538]
2970 0.93072 	[0.93034]
2975 0.93355 	[0.93381]
3041 0.94191 	[0.94125]
3048 0.92276 	[0.92271]
3050 0.92272 	[0.9217]
3055 0.94155 	[0.94166]
3073 0.9992 	[0.99922]
3080 0.96264 	[0.96244]
3082 0.97902 	[0.97929]
3087 0.98241 	[0.98264]
3185 0.9904 	[0.99059]
3192 0.96986 	[0.96985]
3194 0.97024 	[0.971]
3199 0.98962 	[0.98978]
3217 0.9976 	[0.99761]
3224 0.96109 	[0.96122]
3226 0.97745 	[0.97779]
3231 0.98083 	[0.98131]
3297 0.9896 	[0.98974]
3304 0.96909 	[0.96902]
3306 0.96945 	[0.96998]
3311 0.98883 	[0.98882]
3329 0.98321 	[0.98341]
3336 0.96287 	[0.96342]
3338 0.96319 	[0.96341]
3343 0.98249 	[0.98281]
3441 0.99041 	[0.99088]
3448 0.9541 	[0.9553]
3450 0.97041 	[0.97053]
3455 0.9737 	[0.97418]
3473 0.98162 	[0.98196]
3480 0.96133 	[0.96198]
3482 0.96163 	[0.96195]
3487 0.98092 	[0.98138]
3553 0.98962 	[0.98999]
3560 0.95333 	[0.95393]
3562 0.96963 	[0.96981]
3567 0.97292 	[0.97362]
3585 0.95123 	[0.95148, 0.95154, 0.95103, 0.95167, 0.95158, 0.95104, 0.95147, 0.95099, 0.95201]
3586 0.94005 	[0.94102]
3587 0.95156 	[0.95162]
3588 0.92437 	[0.92421]
3589 0.95866 	[0.95905]
3590 0.93148 	[0.93216]
3591 0.94299 	[0.94381]
3592 0.93181 	[0.93264, 0.93092, 0.9328, 0.93195, 0.93312]
3593 0.95904 	[0.95941]
3594 0.93186 	[0.93225, 0.9307, 0.93205, 0.93236, 0.9337]
3595 0.94337 	[0.94345]
3596 0.93219 	[0.9332]
3597 0.95046 	[0.95082]
3598 0.9393 	[0.93948]
3599 0.9508 	[0.95153, 0.9505, 0.95127, 0.95043, 0.95112]
3600 0.92363 	[0.92336]
3601 0.95027 	[0.9506]
3608 0.93088 	[0.93246]
3610 0.93092 	[0.93133]
3615 0.94985 	[0.95029]
3617 0.95846 	[0.95802]
3624 0.92307 	[0.9233]
3626 0.93911 	[0.93874]
3631 0.94204 	[0.94163]
3633 0.95866 	[0.95912]
3640 0.92325 	[0.92468]
3642 0.9393 	[0.94051]
3647 0.94223 	[0.94247]
3649 0.95104 	[0.95115]
3656 0.93163 	[0.93291]
3658 0.93167 	[0.93232]
3663 0.95061 	[0.95112]
3665 0.95008 	[0.95025]
3672 0.9307 	[0.93246]
3674 0.93073 	[0.93159]
3679 0.94966 	[0.95008]
3681 0.95827 	[0.95843]
3688 0.92288 	[0.92209]
3690 0.93892 	[0.93801]
3695 0.94185 	[0.9408]
3697 0.95846 	[0.95861, 0.95866, 0.95819, 0.95868, 0.95778]
3698 0.9313 	[0.93153]
3699 0.9428 	[0.94284]
3700 0.93163 	[0.93041]
3701 0.94989 	[0.94942]
3702 0.93873 	[0.93963]
3703 0.95023 	[0.95076]
3704 0.92307 	[0.92418]
3705 0.95027 	[0.95084]
3706 0.93911 	[0.93923]
3707 0.95061 	[0.95025]
3708 0.92344 	[0.92447]
3709 0.9577 	[0.95712]
3710 0.93054 	[0.93136]
3711 0.94204 	[0.94259]
3712 0.93088 	[0.92969]
3713 0.95066 	[0.95036]
3720 0.93125 	[0.9314]
3722 0.9313 	[0.9316]
3727 0.95023 	[0.94969]
3729 0.9497 	[0.94999, 0.95036, 0.94905, 0.9499, 0.94947]
3730 0.93854 	[0.93912]
3731 0.95004 	[0.94955]
3732 0.92288 	[0.92197]
3733 0.95712 	[0.95731]
3734 0.92998 	[0.9303]
3735 0.94147 	[0.94225]
3736 0.93032 	[0.93107]
3737 0.95751 	[0.95747]
3738 0.93036 	[0.93043]
3739 0.94185 	[0.9417]
3740 0.9307 	[0.93185]
3741 0.94893 	[0.94881]
3742 0.93779 	[0.93844]
3743 0.94928 	[0.95003]
3744 0.92214 	[0.9213]
3745 0.95789 	[0.95722]
3752 0.92251 	[0.92265]
3754 0.93854 	[0.93761]
3759 0.94147 	[0.94083]
3761 0.95808 	[0.95825]
3768 0.9227 	[0.92199]
3770 0.93873 	[0.93805]
3775 0.94166 	[0.94085]
3777 0.95046 	[0.95106]
3784 0.93107 	[0.93288]
3786 0.93111 	[0.9328]
3791 0.95004 	[0.95002]
3793 0.94951 	[0.94881]
3800 0.93014 	[0.92922]
3802 0.93017 	[0.92872]
3807 0.94909 	[0.94873]
3809 0.9577 	[0.95791, 0.95816, 0.95778, 0.95781, 0.95766]
3810 0.93054 	[0.93104]
3811 0.94204 	[0.94193]
3812 0.93088 	[0.93056]
3813 0.94912 	[0.9493]
3814 0.93798 	[0.93955]
3815 0.94947 	[0.95]
3816 0.92232 	[0.92291]
3817 0.94951 	[0.94962]
3818 0.93836 	[0.93853]
3819 0.94985 	[0.94959]
3820 0.9227 	[0.92437]
3821 0.95693 	[0.95713]
3822 0.92979 	[0.93113]
3823 0.94128 	[0.9418]
3824 0.93014 	[0.92989]
3825 0.95789 	[0.95791]
3832 0.92251 	[0.92411]
3834 0.93854 	[0.93945]
3839 0.94147 	[0.94162]
3841 0.95124 	[0.95228]
3848 0.91605 	[0.91584]
3850 0.93203 	[0.93288]
3855 0.93487 	[0.9351]
3953 0.94248 	[0.94266]
3960 0.92331 	[0.92275]
3962 0.92328 	[0.92347]
3967 0.94212 	[0.94212]
3985 0.94972 	[0.95038]
3992 0.91457 	[0.91486]
3994 0.93054 	[0.93119]
3999 0.93337 	[0.93337]
4065 0.94172 	[0.94233]
4072 0.92257 	[0.92296]
4074 0.92254 	[0.92375]
4079 0.94137 	[0.94211]
In [40]:
e1_fidelity_extracted
Out[40]:
12-element Array{Any,1}:
 Any[1, 14, 12, 7, 225, 238, 236, 231, 113, 126  …  2540, 2535, 2417, 2430, 2428, 2423, 2449, 2462, 2460, 2455]
 Any[1, 2, 5, 6, 225, 226, 229, 230, 113, 114  …  2533, 2534, 2417, 2418, 2421, 2422, 2449, 2450, 2453, 2454]
 Any[1, 3, 9, 11, 225, 227, 233, 235, 113, 115  …  2537, 2539, 2417, 2419, 2425, 2427, 2449, 2451, 2457, 2459]
 Any[1, 4, 13, 16, 225, 228, 237, 240, 113, 116  …  2541, 2544, 2417, 2420, 2429, 2432, 2449, 2452, 2461, 2464]
 Any[1, 15, 8, 10, 209, 223, 216, 218, 177, 191  …  2520, 2522, 2481, 2495, 2488, 2490, 2401, 2415, 2408, 2410]
 Any[1, 15, 8, 10, 17, 31, 24, 26, 65, 79  …  2328, 2330, 2369, 2383, 2376, 2378, 2385, 2399, 2392, 2394]
 Any[1, 15, 8, 10, 33, 47, 40, 42, 129, 143  …  2344, 2346, 2433, 2447, 2440, 2442, 2465, 2479, 2472, 2474]
 Any[1, 15, 8, 10, 49, 63, 56, 58, 193, 207  …  2360, 2362, 2497, 2511, 2504, 2506, 2545, 2559, 2552, 2554]
 Any[1, 15, 8, 10, 225, 239, 232, 234, 113, 127  …  1768, 1770, 1649, 1663, 1656, 1658, 1681, 1695, 1688, 1690]
 Any[1, 15, 8, 10, 225, 239, 232, 234, 113, 127  …  1512, 1514, 1393, 1407, 1400, 1402, 1425, 1439, 1432, 1434]
 Any[1, 15, 8, 10, 225, 239, 232, 234, 113, 127  …  2792, 2794, 2673, 2687, 2680, 2682, 2705, 2719, 2712, 2714]
 Any[1, 15, 8, 10, 225, 239, 232, 234, 113, 127  …  4072, 4074, 3953, 3967, 3960, 3962, 3985, 3999, 3992, 3994]

Okay, that is all of experiment 1 and its offsets done.

That leaves experiment 2 (we need two subsampling groups for the peeling decoder)

Recall experiment 2 looks like this:

Experiment 2

In [41]:
experiments[2]
Out[41]:
4-element Array{Tuple{Int64,Int64},1}:
 (1, 3)
 (2, 2)
 (2, 1)
 (1, 3)
In [42]:
# So q1 has a single-qubit MUB, q2&3 and q4&5 each have a two-qubit MUB experiment, and q6 has a single-qubit MUB, so let's set that up:

initialGates = generateGates(experiments[2],[circuit1q,circuit2q])
reverseGates = transpose(initialGates) 
# So at this point we have set up one of the 'new' experiments.

experiment2 = []
# Get the actual probabilities.
@showprogress 1 "Lengths: " for m in lengths
    wholeCircuit = reverseGates*noise^m*initialGates*rm*start./64 # Normalising our zs
    probs = [z'*wholeCircuit for z in zs]
    push!(experiment2,probs)
end
Lengths: 100%|██████████████████████████████████████████| Time: 0:01:17
In [43]:
# Generate shot-limited statistics

cumMatrix = map(cumsum,experiment2)
experiment2_observed = shotSimulator(64,shotsToDo,cumMatrix);
# Fit and extract the fidelities
(params,l, failed) = fitTheFidelities(lengths,experiment2_observed)
experiment2_fidelities = vcat(1,[p[2] for p in params])
Out[43]:
64-element Array{Float64,1}:
 1.0
 0.9718492848668059
 0.9992044876373045
 0.9713105048705868
 0.9981981062996419
 0.9701024595100746
 0.9985907483606172
 0.9705630223407407
 0.9517775779441707
 0.9245733330744327
 0.9512562769031969
 0.9242646661745776
 0.950147357210411
 ⋮
 0.9488548623792893
 0.9213777820094172
 0.9497054630585438
 0.9218973870298669
 0.9822344066885326
 0.9548523762940008
 0.981512539397869
 0.9541915237156421
 0.9803364849752244
 0.9533235904595504
 0.980831313001577
 0.953703278609339
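Under the hood, `fitTheFidelities` is fitting an exponential decay of the form A·f^m to each transformed probability as a function of the sequence length m, and the fitted f is the fidelity. As a rough illustration only (the notebook's actual fitting routine is more careful about weighting and failed fits), a log-linear least-squares fit in Python looks like this; `fit_fidelity` is a hypothetical helper name:

```python
import math

def fit_fidelity(lengths, values):
    """Fit values ~ A * f**m by least squares on log(values); return f."""
    xs = list(lengths)
    ys = [math.log(v) for v in values]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    # Slope of the log-linear fit is log(f).
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
    return math.exp(slope)

# Noiseless synthetic decay with f = 0.97 recovers f exactly.
lengths = [1, 2, 4, 8, 16]
data = [0.9 * 0.97 ** m for m in lengths]
f_est = fit_fidelity(lengths, data)
```

With shot noise the fitted f fluctuates around the true value, which is why the estimates above only agree with the oracle to three or four decimal places.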
In [44]:
xx = ifwht_natural([z'*reverseGates*noise*initialGates*start for z in zs]/64)
for p in xx
    print("$p : $(findall(x->isapprox(x,p),actualOracle))\n")
end
0.9999999999999999 : [1]
0.972 : [2, 4]
0.9992001200000001 : [57, 1153, 2113]
0.9712225166 : [60, 1156, 2116]
0.9982008000000001 : [29, 209, 3089]
0.9702511776000001 : [32, 212, 3092]
0.9986004800000002 : [37, 141, 2061, 2241, 3201]
0.9706396666000001 : [40, 144, 2064, 2244, 3204]
0.9518095999999999 : [833, 1793]
0.9251589311999998 : [836, 1796]
0.9510482665 : [889, 1849]
0.9244189151 : [892, 1852]
0.9500971041999999 : [861, 1821, 2001, 3921]
0.9234943852999999 : [864, 1824, 2004, 3924]
0.9504775233999999 : [869, 937, 1017, 1829, 1933, 2857, 2893, 3897]
0.9238641527999998 : [872, 940, 1020, 1832, 1936, 2860, 2896, 3900]
0.951232 : [525, 705, 3585]
0.9243735040000001 : [528, 708, 3588]
0.9504647295000003 : [613, 681, 761, 1573, 1677, 2601, 2637, 3641]
0.9236277171000001 : [616, 684, 764, 1576, 1680, 2604, 2640, 3644]
0.9495061498000001 : [733, 3613, 3793]
0.9226959776000001 : [736, 3616, 3796]
0.9498895356000002 : [741, 2765, 3621, 3725]
0.9230686286 : [744, 2768, 3624, 3728]
0.9834048 : [131, 385, 2051, 2305]
0.9556454655999999 : [388, 2308]
0.9826117951 : [119, 187, 373, 441, 1079, 1131, 1333, 1385, 2107, 2361]
0.9548746649000001 : [376, 444, 1336, 1388, 2364]
0.9816210645000001 : [159, 413, 2079, 2259, 2333, 2513, 3219, 3473]
0.9539116747 : [416, 2336, 2516, 3476]
0.9820173092000001 : [167, 247, 421, 501, 1259, 1513, 2087, 2191, 2341, 2445, 3127, 3179, 3381, 3433]
0.9542968245 : [424, 504, 1516, 2344, 2448, 3384, 3436]
0.9992 : [13, 193, 3073]
0.9712224 : [16, 196, 3076]
0.9984007599000001 : [101, 169, 249, 1061, 1165, 2089, 2125, 3129]
0.9704455386 : [104, 172, 252, 1064, 1168, 2092, 2128, 3132]
0.9974022394000001 : [221, 3101, 3281]
0.9694749767 : [224, 3104, 3284]
0.9978015996000001 : [229, 2253, 3109, 3213]
0.9698631547999998 : [232, 2256, 3112, 3216]
0.9510481523000002 : [809, 845, 1805, 1985, 3905]
0.9244188041 : [812, 848, 1808, 1988, 3908]
0.9502874279000002 : [949, 1893, 1961, 2041, 2869, 2921, 3961]
0.9236793799000003 : [952, 1896, 1964, 2044, 2872, 2924, 3964]
0.9493370265000001 : [2013, 3933]
0.9227555897000002 : [2016, 3936]
0.9497171414 : [2021, 3049, 3941, 4009, 4089]
0.9231250615 : [2024, 3052, 3944, 4012, 4092]
0.9504646143999999 : [717, 3597, 3777]
0.9236276051999999 : [720, 3600, 3780]
0.9496979577000001 : [1765, 2793, 3685, 3753, 3833]
0.9228824149 : [1768, 2796, 3688, 3756, 3836]
0.9487401449000001 : [3805]
0.9219514208000001 : [3808]
0.9491232240000002 : [3813]
0.9223237736999998 : [3816]
0.9826116762000001 : [39, 143, 293, 397, 2063, 2243, 2317, 2497, 3203, 3457]
0.9548745492000003 : [296, 400, 2320, 2500, 3460]
0.9818193057000002 : [1191, 1271, 1445, 1525, 2151, 2219, 2299, 2405, 2473, 2553, 3191, 3259, 3445, 3513]
0.9541043651000002 : [1448, 1528, 2408, 2476, 2556, 3448, 3516]
0.9808293676000002 : [2271, 2525, 3231, 3485]
0.9531421453000002 : [2528, 3488]
0.9812252953000004 : [2279, 2533, 3239, 3319, 3493, 3573]
0.9535269870000002 : [2536, 3496, 3576]
In [45]:
fidelitiesExtracted = []
for x1b in potentialSingles[experiments[2][4][2]] # Experiment 2, qubit division 4 (q6); the tuple's second entry picks which MUB experiment was run
    p6 = binaryArrayToNumber(x1b)
    for x2b in all2QlMuBs[experiments[2][3][2]]
        p45 = binaryArrayToNumber(x2b)
        for x2a in all2QlMuBs[experiments[2][2][2]]
            p23 = binaryArrayToNumber(x2a)
            for x1a in potentialSingles[experiments[2][1][2]]
                p1 = binaryArrayToNumber(x1a)
                push!(fidelitiesExtracted,p6*4^5+p45*4^3+p23*4^1+p1+1) # +1 because Julia arrays are 1-indexed
             end
        end
    end
end
map(x->fidelityLabels(x-1,qubits=6),fidelitiesExtracted)
Out[45]:
64-element Array{String,1}:
 "IIIIII"
 "IIIIIZ"
 "IIIZYI"
 "IIIZYZ"
 "IIIXZI"
 "IIIXZZ"
 "IIIYXI"
 "IIIYXZ"
 "IZXIII"
 "IZXIIZ"
 "IZXZYI"
 "IZXZYZ"
 "IZXXZI"
 ⋮
 "ZYZXZI"
 "ZYZXZZ"
 "ZYZYXI"
 "ZYZYXZ"
 "ZXYIII"
 "ZXYIIZ"
 "ZXYZYI"
 "ZXYZYZ"
 "ZXYXZI"
 "ZXYXZZ"
 "ZXYYXI"
 "ZXYYXZ"
In [46]:
# And, just out of interest, we can compare the fidelities we extracted with the actual values
for i in 1:64
    print("$(fidelityLabels(fidelitiesExtracted[i]-1,qubits=6)) ",
        "($(fidelitiesExtracted[i]-1)):\tEstimate: $(experiment2_fidelities[i]) ",
        "<-> $(actualOracle[fidelitiesExtracted[i]]) \t",
        "Percentage Error: $(round.(abs(actualOracle[fidelitiesExtracted[i]]-experiment2_fidelities[i])/(actualOracle[fidelitiesExtracted[i]])*100,digits=4))%\n")
end
IIIIII (0):	Estimate: 1.0 <-> 1.0 	Percentage Error: 0.0%
IIIIIZ (3):	Estimate: 0.9718492848668059 <-> 0.972 	Percentage Error: 0.0155%
IIIZYI (56):	Estimate: 0.9992044876373045 <-> 0.99920012 	Percentage Error: 0.0004%
IIIZYZ (59):	Estimate: 0.9713105048705868 <-> 0.9712225166 	Percentage Error: 0.0091%
IIIXZI (28):	Estimate: 0.9981981062996419 <-> 0.9982008 	Percentage Error: 0.0003%
IIIXZZ (31):	Estimate: 0.9701024595100746 <-> 0.9702511776 	Percentage Error: 0.0153%
IIIYXI (36):	Estimate: 0.9985907483606172 <-> 0.99860048 	Percentage Error: 0.001%
IIIYXZ (39):	Estimate: 0.9705630223407407 <-> 0.9706396666 	Percentage Error: 0.0079%
IZXIII (832):	Estimate: 0.9517775779441707 <-> 0.9518096 	Percentage Error: 0.0034%
IZXIIZ (835):	Estimate: 0.9245733330744327 <-> 0.9251589312 	Percentage Error: 0.0633%
IZXZYI (888):	Estimate: 0.9512562769031969 <-> 0.9510482665 	Percentage Error: 0.0219%
IZXZYZ (891):	Estimate: 0.9242646661745776 <-> 0.9244189151 	Percentage Error: 0.0167%
IZXXZI (860):	Estimate: 0.950147357210411 <-> 0.9500971042 	Percentage Error: 0.0053%
IZXXZZ (863):	Estimate: 0.9228510771672417 <-> 0.9234943853 	Percentage Error: 0.0697%
IZXYXI (868):	Estimate: 0.9509550062603832 <-> 0.9504775234 	Percentage Error: 0.0502%
IZXYXZ (871):	Estimate: 0.9234245081258412 <-> 0.9238641528 	Percentage Error: 0.0476%
IYZIII (704):	Estimate: 0.9509693776758223 <-> 0.951232 	Percentage Error: 0.0276%
IYZIIZ (707):	Estimate: 0.9237571617539286 <-> 0.924373504 	Percentage Error: 0.0667%
IYZZYI (760):	Estimate: 0.9506968275641192 <-> 0.9504647295 	Percentage Error: 0.0244%
IYZZYZ (763):	Estimate: 0.9235819112253988 <-> 0.9236277171 	Percentage Error: 0.005%
IYZXZI (732):	Estimate: 0.9496102218055714 <-> 0.9495061498 	Percentage Error: 0.011%
IYZXZZ (735):	Estimate: 0.9221314737014048 <-> 0.9226959776 	Percentage Error: 0.0612%
IYZYXI (740):	Estimate: 0.9504016799520018 <-> 0.9498895356 	Percentage Error: 0.0539%
IYZYXZ (743):	Estimate: 0.9226123873106471 <-> 0.9230686286 	Percentage Error: 0.0494%
IXYIII (384):	Estimate: 0.9831294394007184 <-> 0.9834048 	Percentage Error: 0.028%
IXYIIZ (387):	Estimate: 0.9557896292882174 <-> 0.9556454656 	Percentage Error: 0.0151%
IXYZYI (440):	Estimate: 0.9824020341673697 <-> 0.9826117951 	Percentage Error: 0.0213%
IXYZYZ (443):	Estimate: 0.9551261143540881 <-> 0.9548746649 	Percentage Error: 0.0263%
IXYXZI (412):	Estimate: 0.9812769722228375 <-> 0.9816210645 	Percentage Error: 0.0351%
IXYXZZ (415):	Estimate: 0.9542270964263432 <-> 0.9539116747 	Percentage Error: 0.0331%
IXYYXI (420):	Estimate: 0.9817477059842559 <-> 0.9820173092 	Percentage Error: 0.0275%
IXYYXZ (423):	Estimate: 0.9545996003068581 <-> 0.9542968245 	Percentage Error: 0.0317%
ZIIIII (3072):	Estimate: 0.9991867343374298 <-> 0.9992 	Percentage Error: 0.0013%
ZIIIIZ (3075):	Estimate: 0.9710457018641796 <-> 0.9712224 	Percentage Error: 0.0182%
ZIIZYI (3128):	Estimate: 0.998388910567334 <-> 0.9984007599 	Percentage Error: 0.0012%
ZIIZYZ (3131):	Estimate: 0.9705023495825441 <-> 0.9704455386 	Percentage Error: 0.0059%
ZIIXZI (3100):	Estimate: 0.9973797355262007 <-> 0.9974022394 	Percentage Error: 0.0023%
ZIIXZZ (3103):	Estimate: 0.9693171746739611 <-> 0.9694749767 	Percentage Error: 0.0163%
ZIIYXI (3108):	Estimate: 0.9977773222629968 <-> 0.9978015996 	Percentage Error: 0.0024%
ZIIYXZ (3111):	Estimate: 0.9697209834263271 <-> 0.9698631548 	Percentage Error: 0.0147%
ZZXIII (3904):	Estimate: 0.9508638020616145 <-> 0.9510481523 	Percentage Error: 0.0194%
ZZXIIZ (3907):	Estimate: 0.9235900658142702 <-> 0.9244188041 	Percentage Error: 0.0896%
ZZXZYI (3960):	Estimate: 0.9503518534725851 <-> 0.9502874279 	Percentage Error: 0.0068%
ZZXZYZ (3963):	Estimate: 0.9233633889010356 <-> 0.9236793799 	Percentage Error: 0.0342%
ZZXXZI (3932):	Estimate: 0.9493011695175068 <-> 0.9493370265 	Percentage Error: 0.0038%
ZZXXZZ (3935):	Estimate: 0.9219140492893186 <-> 0.9227555897 	Percentage Error: 0.0912%
ZZXYXI (3940):	Estimate: 0.9501258125442994 <-> 0.9497171414 	Percentage Error: 0.043%
ZZXYXZ (3943):	Estimate: 0.9224895346040815 <-> 0.9231250615 	Percentage Error: 0.0688%
ZYZIII (3776):	Estimate: 0.950188874313939 <-> 0.9504646144 	Percentage Error: 0.029%
ZYZIIZ (3779):	Estimate: 0.9229840720963681 <-> 0.9236276052 	Percentage Error: 0.0697%
ZYZZYI (3832):	Estimate: 0.9499311032332632 <-> 0.9496979577 	Percentage Error: 0.0245%
ZYZZYZ (3835):	Estimate: 0.9228624353752058 <-> 0.9228824149 	Percentage Error: 0.0022%
ZYZXZI (3804):	Estimate: 0.9488548623792893 <-> 0.9487401449 	Percentage Error: 0.0121%
ZYZXZZ (3807):	Estimate: 0.9213777820094172 <-> 0.9219514208 	Percentage Error: 0.0622%
ZYZYXI (3812):	Estimate: 0.9497054630585438 <-> 0.949123224 	Percentage Error: 0.0613%
ZYZYXZ (3815):	Estimate: 0.9218973870298669 <-> 0.9223237737 	Percentage Error: 0.0462%
ZXYIII (3456):	Estimate: 0.9822344066885326 <-> 0.9826116762 	Percentage Error: 0.0384%
ZXYIIZ (3459):	Estimate: 0.9548523762940008 <-> 0.9548745492 	Percentage Error: 0.0023%
ZXYZYI (3512):	Estimate: 0.981512539397869 <-> 0.9818193057 	Percentage Error: 0.0312%
ZXYZYZ (3515):	Estimate: 0.9541915237156421 <-> 0.9541043651 	Percentage Error: 0.0091%
ZXYXZI (3484):	Estimate: 0.9803364849752244 <-> 0.9808293676 	Percentage Error: 0.0503%
ZXYXZZ (3487):	Estimate: 0.9533235904595504 <-> 0.9531421453 	Percentage Error: 0.019%
ZXYYXI (3492):	Estimate: 0.980831313001577 <-> 0.9812252953 	Percentage Error: 0.0402%
ZXYYXZ (3495):	Estimate: 0.953703278609339 <-> 0.953526987 	Percentage Error: 0.0185%
In [47]:
# I'll spare you the sanity check and put these directly into the oracle

for (fidelityIndex,fidelity)  in enumerate(fidelitiesExtracted)
    push!(estimateOracle[fidelity],experiment2_fidelities[fidelityIndex])
end
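Each Pauli can pick up several independent estimates across the different experiments and offsets, which is why some oracle entries below carry a whole list of values. When the oracle is eventually consumed, one reasonable way to collapse the repeats is a simple mean; here is a minimal Python sketch with hypothetical names standing in for the Julia `estimateOracle`:

```python
from statistics import mean

# Hypothetical stand-in: Pauli index -> list of repeated estimates.
estimate_oracle = {
    1: [1.0, 1.0, 1.0],
    10: [0.98004, 0.98015, 0.97996],
}

# Collapse repeated estimates for the same Pauli into one number.
combined = {idx: mean(vals) for idx, vals in estimate_oracle.items() if vals}
```

Averaging also tightens the estimate: the spread within each list gives a rough sense of the shot-noise error on any single run.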
In [48]:
# our oracle so far
for i in 1:4096
    if estimateOracle[i]!=[]
        print("$(string(i,pad=3)) $(round.(actualOracle[i],digits=5)) \t$(round.(estimateOracle[i],digits=5))\n")
    end
end
001 1.0 	[1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
002 0.972 	[0.97245]
003 0.984 	[0.98379]
004 0.972 	[0.97222, 0.97185]
005 0.9914 	[0.99144]
006 0.97941 	[0.97957]
007 0.9914 	[0.99154]
008 0.96342 	[0.96466, 0.96398, 0.96294, 0.96325, 0.9632, 0.96397, 0.96343, 0.96334, 0.96326]
009 0.9918 	[0.99173]
010 0.9798 	[0.98004, 0.98015, 0.97996, 0.97959, 0.97987, 0.9798, 0.97983, 0.97964, 0.98009]
011 0.9918 	[0.99173]
012 0.96381 	[0.96422]
013 0.9992 	[0.99924]
014 0.97122 	[0.97099]
015 0.98321 	[0.98326, 0.98301, 0.98303, 0.98328, 0.98312, 0.98329, 0.98322, 0.98338, 0.98344]
016 0.97122 	[0.97164]
017 0.999 	[0.99905]
024 0.96245 	[0.9621]
026 0.97882 	[0.97902]
029 0.9982 	[0.9982]
031 0.98222 	[0.98198]
032 0.97025 	[0.9701]
033 0.9912 	[0.99122]
037 0.9986 	[0.99859]
040 0.97064 	[0.96982, 0.97056]
042 0.97102 	[0.97078]
047 0.99041 	[0.99034]
049 0.9914 	[0.99143]
056 0.97083 	[0.97072]
057 0.9992 	[0.9992]
058 0.97122 	[0.97188]
060 0.97122 	[0.97131]
063 0.99061 	[0.99061]
065 0.9998 	[0.99983]
072 0.96322 	[0.96257]
074 0.97961 	[0.97967]
079 0.98301 	[0.98293]
081 0.9988 	[0.99889]
088 0.96225 	[0.96175]
090 0.97863 	[0.97867]
095 0.98202 	[0.9819]
097 0.991 	[0.99091]
104 0.97045 	[0.97043]
106 0.97082 	[0.97099]
111 0.99022 	[0.99018]
113 0.9912 	[0.9911, 0.99132, 0.99138, 0.99103, 0.99118, 0.99143, 0.99122, 0.99126, 0.99144]
114 0.97922 	[0.97945]
115 0.99121 	[0.99099]
116 0.96322 	[0.96393]
117 0.9986 	[0.99861]
118 0.97063 	[0.97115]
119 0.98261 	[0.98272]
120 0.97064 	[0.97126, 0.9713, 0.97094, 0.97077, 0.97058]
121 0.999 	[0.99898]
122 0.97102 	[0.97163, 0.97103, 0.97088, 0.97107, 0.97178]
123 0.98301 	[0.98273]
124 0.97103 	[0.97133]
125 0.9904 	[0.99044]
126 0.97843 	[0.97817]
127 0.99041 	[0.99052, 0.99049, 0.99049, 0.9904, 0.99054]
128 0.96245 	[0.96317]
129 0.9994 	[0.99942]
136 0.96283 	[0.96246]
138 0.97922 	[0.9786]
143 0.98261 	[0.98276]
145 0.9984 	[0.99839, 0.99843, 0.99843, 0.99828, 0.99835, 0.99856, 0.99835, 0.9983, 0.99838]
146 0.97043 	[0.97101]
147 0.98241 	[0.98191]
148 0.97045 	[0.97054]
149 0.9898 	[0.98999]
150 0.97785 	[0.9781]
151 0.98982 	[0.98993]
152 0.96186 	[0.96303, 0.96228, 0.96186, 0.9618, 0.96182]
153 0.9902 	[0.99001]
154 0.97824 	[0.97831, 0.97833, 0.97787, 0.97816, 0.9784]
155 0.99022 	[0.98994]
156 0.96225 	[0.96273]
157 0.9976 	[0.99757]
158 0.96965 	[0.96954]
159 0.98162 	[0.98164, 0.98198, 0.98148, 0.98163, 0.98207]
160 0.96967 	[0.97002]
161 0.9906 	[0.99076]
168 0.97006 	[0.96884]
170 0.97043 	[0.96998]
175 0.98982 	[0.98971]
177 0.9908 	[0.99058]
184 0.97025 	[0.97032]
186 0.97063 	[0.97082]
191 0.99002 	[0.98983]
193 0.9992 	[0.99919]
200 0.96264 	[0.96275]
202 0.97902 	[0.97906]
207 0.98241 	[0.98238]
209 0.9982 	[0.99823]
216 0.96167 	[0.96222]
218 0.97804 	[0.9784]
223 0.98142 	[0.98127]
225 0.9904 	[0.99038, 0.99055, 0.99058, 0.99016, 0.99042, 0.99059, 0.99034, 0.99046, 0.99057]
226 0.97843 	[0.97859]
227 0.99041 	[0.99021]
228 0.96245 	[0.96336]
229 0.9978 	[0.99777]
230 0.96985 	[0.97035]
231 0.98182 	[0.98204]
232 0.96986 	[0.97036, 0.97029, 0.96985, 0.97001, 0.96985]
233 0.9982 	[0.99811]
234 0.97024 	[0.9708, 0.97016, 0.97003, 0.96999, 0.97078]
235 0.98222 	[0.9818]
236 0.97025 	[0.97049]
237 0.9896 	[0.98967]
238 0.97765 	[0.97746]
239 0.98962 	[0.98974, 0.98969, 0.9898, 0.98967, 0.98959]
240 0.96167 	[0.96259]
241 0.9906 	[0.99058]
248 0.97006 	[0.97013]
250 0.97043 	[0.97112]
255 0.98982 	[0.98974]
257 0.984 	[0.98416]
264 0.96365 	[0.96451]
266 0.96397 	[0.96369]
271 0.98328 	[0.98343]
369 0.99121 	[0.99107]
376 0.95487 	[0.95525]
378 0.97119 	[0.97133]
383 0.97449 	[0.97444]
385 0.9834 	[0.98313]
388 0.95565 	[0.95579]
401 0.98241 	[0.98228]
408 0.9621 	[0.96265]
410 0.96241 	[0.96167]
413 0.98162 	[0.98128]
415 0.9817 	[0.98183]
416 0.95391 	[0.95423]
421 0.98202 	[0.98175]
424 0.9543 	[0.9546]
441 0.98261 	[0.9824]
444 0.95487 	[0.95513]
481 0.99041 	[0.9903]
488 0.9541 	[0.95443]
490 0.97041 	[0.97051]
495 0.9737 	[0.97375]
513 0.952 	[0.95181]
520 0.93256 	[0.93235]
522 0.93261 	[0.93229]
527 0.95156 	[0.95112]
625 0.95923 	[0.95877]
632 0.92381 	[0.92436]
634 0.93986 	[0.93907]
639 0.9428 	[0.94314]
657 0.95046 	[0.95032]
664 0.93107 	[0.93085]
666 0.93111 	[0.93106]
671 0.95004 	[0.94956]
705 0.95123 	[0.95097]
708 0.92437 	[0.92376]
733 0.94951 	[0.94961]
736 0.9227 	[0.92213]
737 0.95846 	[0.95808]
741 0.94989 	[0.9504]
744 0.92307 	[0.92348, 0.92261]
746 0.93911 	[0.93804]
751 0.94204 	[0.94229]
761 0.95046 	[0.9507]
764 0.92363 	[0.92358]
769 0.952 	[0.95314]
776 0.91679 	[0.91672]
778 0.93277 	[0.9337]
783 0.93563 	[0.93591]
833 0.95181 	[0.95178]
836 0.92516 	[0.92457]
861 0.9501 	[0.95015]
864 0.92349 	[0.92285]
869 0.95048 	[0.95096]
872 0.92386 	[0.92342]
881 0.94324 	[0.94355]
888 0.92405 	[0.9236]
889 0.95105 	[0.95126]
890 0.92403 	[0.92427]
892 0.92442 	[0.92426]
895 0.94287 	[0.94297]
913 0.95048 	[0.95116]
920 0.91531 	[0.91568]
922 0.93128 	[0.9319]
927 0.93412 	[0.93411]
993 0.94248 	[0.94322]
1000 0.92331 	[0.92388]
1002 0.92328 	[0.92456]
1007 0.94212 	[0.943]
1025 0.9998 	[0.99981]
1032 0.96322 	[0.96318]
1034 0.97961 	[0.97964]
1039 0.98301 	[0.98303]
1137 0.991 	[0.99103]
1144 0.97045 	[0.97076]
1146 0.97082 	[0.97067]
1151 0.99022 	[0.99032]
1169 0.9982 	[0.99815]
1176 0.96167 	[0.96161]
1178 0.97804 	[0.97769]
1183 0.98142 	[0.98127]
1249 0.9902 	[0.99013]
1256 0.96967 	[0.96964]
1258 0.97004 	[0.96975]
1263 0.98942 	[0.98961]
1281 0.9838 	[0.98392]
1288 0.96345 	[0.96429]
1290 0.96377 	[0.96347]
1295 0.98308 	[0.98325]
1393 0.99101 	[0.99088]
1400 0.95468 	[0.95503]
1402 0.97099 	[0.97113]
1407 0.97429 	[0.97416]
1425 0.98222 	[0.98206]
1432 0.96191 	[0.96242]
1434 0.96222 	[0.96148]
1439 0.98151 	[0.98164]
1505 0.99022 	[0.99009]
1512 0.95391 	[0.95415]
1514 0.97022 	[0.97026]
1519 0.97351 	[0.97342]
1537 0.95181 	[0.95153]
1544 0.93237 	[0.93249]
1546 0.93242 	[0.93138]
1551 0.95137 	[0.9516]
1649 0.95904 	[0.95917]
1656 0.92363 	[0.92448]
1658 0.93967 	[0.93918]
1663 0.94261 	[0.94289]
1681 0.95027 	[0.94976]
1688 0.93088 	[0.93198]
1690 0.93092 	[0.93027]
1695 0.94985 	[0.9503]
1761 0.95827 	[0.95832]
1768 0.92288 	[0.92245]
1770 0.93892 	[0.93812]
1775 0.94185 	[0.94178]
1793 0.95181 	[0.95238, 0.95156, 0.95179, 0.95135, 0.95203, 0.9521, 0.9519, 0.95136, 0.95258]
1794 0.92477 	[0.92446]
1795 0.9362 	[0.93528]
1796 0.92516 	[0.92545]
1797 0.94324 	[0.94294]
1798 0.93221 	[0.93232]
1799 0.94363 	[0.94354]
1800 0.9166 	[0.91648, 0.91646, 0.91656, 0.91784, 0.9183]
1801 0.94362 	[0.94319]
1802 0.93259 	[0.93246, 0.93224, 0.93296, 0.93349, 0.93417]
1803 0.94401 	[0.94303]
1804 0.91697 	[0.91792]
1805 0.95105 	[0.95141]
1806 0.92403 	[0.92447]
1807 0.93544 	[0.9354, 0.93484, 0.93514, 0.93512, 0.93569]
1808 0.92442 	[0.9247]
1809 0.95086 	[0.951]
1816 0.91568 	[0.91548]
1818 0.93165 	[0.93214]
1823 0.9345 	[0.93388]
1825 0.94305 	[0.94298]
1832 0.92386 	[0.92431]
1834 0.92384 	[0.92486]
1839 0.94269 	[0.94205]
1841 0.94324 	[0.94415]
1848 0.92405 	[0.92538]
1850 0.92403 	[0.92629]
1855 0.94287 	[0.94276]
1857 0.95162 	[0.9515]
1864 0.91642 	[0.91693]
1866 0.9324 	[0.93311]
1871 0.93525 	[0.93491]
1873 0.95067 	[0.95053]
1880 0.91549 	[0.91576]
1882 0.93147 	[0.93215]
1887 0.93431 	[0.9336]
1889 0.94286 	[0.94235]
1896 0.92368 	[0.92356]
1898 0.92365 	[0.92281]
1903 0.9425 	[0.94212]
1905 0.94305 	[0.94339, 0.94273, 0.9423, 0.9423, 0.94314]
1906 0.93203 	[0.93152]
1907 0.94344 	[0.94238]
1908 0.91642 	[0.91613]
1909 0.95048 	[0.95032]
1910 0.92347 	[0.92295]
1911 0.93487 	[0.93504]
1912 0.92386 	[0.92416]
1913 0.95086 	[0.95001]
1914 0.92384 	[0.92337]
1915 0.93525 	[0.93367]
1916 0.92423 	[0.92525]
1917 0.94229 	[0.9425]
1918 0.93128 	[0.93184]
1919 0.94269 	[0.94333]
1920 0.91568 	[0.91546]
1921 0.95124 	[0.95061]
1928 0.91605 	[0.91689]
1930 0.93203 	[0.93251]
1935 0.93487 	[0.93434]
1937 0.95029 	[0.9507, 0.95027, 0.94995, 0.94922, 0.94962]
1938 0.92328 	[0.92244]
1939 0.93468 	[0.93311]
1940 0.92368 	[0.92291]
1941 0.94172 	[0.94114]
1942 0.93072 	[0.93039]
1943 0.94212 	[0.94191]
1944 0.91513 	[0.9153]
1945 0.9421 	[0.94118]
1946 0.9311 	[0.9307]
1947 0.9425 	[0.94089]
1948 0.91549 	[0.91673]
1949 0.94953 	[0.94918]
1950 0.92254 	[0.92334]
1951 0.93393 	[0.93375]
1952 0.92294 	[0.92233]
1953 0.94248 	[0.94202]
1960 0.92331 	[0.92313]
1962 0.92328 	[0.92369]
1967 0.94212 	[0.94103]
1969 0.94267 	[0.94262]
1976 0.92349 	[0.92362]
1978 0.92347 	[0.92327]
1983 0.94231 	[0.94203]
1985 0.95105 	[0.95136]
1992 0.91586 	[0.9177]
1994 0.93184 	[0.93347]
1999 0.93469 	[0.93516]
2001 0.9501 	[0.95018]
2008 0.91494 	[0.91412]
2010 0.93091 	[0.93001]
2015 0.93374 	[0.93245]
2017 0.94229 	[0.94288, 0.94269, 0.94172, 0.94149, 0.94266]
2018 0.93128 	[0.93103]
2019 0.94269 	[0.94167]
2020 0.91568 	[0.91585]
2021 0.94972 	[0.94985]
2022 0.92272 	[0.92263]
2023 0.93412 	[0.93443]
2024 0.92313 	[0.92311]
2025 0.9501 	[0.94919]
2026 0.9231 	[0.92261]
2027 0.9345 	[0.93292]
2028 0.92349 	[0.92483]
2029 0.94153 	[0.942]
2030 0.93054 	[0.93122]
2031 0.94193 	[0.94241]
2032 0.91494 	[0.91519]
2033 0.94248 	[0.94316]
2040 0.92331 	[0.92479]
2042 0.92328 	[0.92505]
2047 0.94212 	[0.94181]
2049 0.9994 	[0.99943]
2056 0.96283 	[0.96281]
2058 0.97922 	[0.97906]
2063 0.98261 	[0.98284]
2161 0.9906 	[0.99071]
2168 0.97006 	[0.97017]
2170 0.97043 	[0.97057]
2175 0.98982 	[0.98989]
2193 0.9978 	[0.99772]
2200 0.96128 	[0.96122]
2202 0.97765 	[0.97763]
2207 0.98103 	[0.98104]
2273 0.9898 	[0.98994]
2280 0.96928 	[0.96948]
2282 0.96965 	[0.96948]
2287 0.98903 	[0.98914]
2305 0.9834 	[0.98339, 0.98366, 0.98343, 0.98324, 0.98356, 0.98321, 0.98309, 0.98378, 0.98364]
2306 0.97158 	[0.9719]
2307 0.98347 	[0.98351]
2308 0.95565 	[0.95684]
2309 0.99081 	[0.9906]
2310 0.96299 	[0.96299]
2311 0.97488 	[0.97548]
2312 0.96307 	[0.96325, 0.96355, 0.9629, 0.96307, 0.96264]
2313 0.99121 	[0.99128]
2314 0.96338 	[0.96376, 0.96409, 0.96233, 0.9639, 0.96396]
2315 0.97528 	[0.97523]
2316 0.96345 	[0.96479]
2317 0.98261 	[0.98285]
2318 0.9708 	[0.97093]
2319 0.98269 	[0.9828, 0.98256, 0.98264, 0.98297, 0.98268]
2320 0.95487 	[0.95619]
2321 0.98241 	[0.98209]
2328 0.9621 	[0.96197]
2330 0.96241 	[0.96132]
2335 0.9817 	[0.98188]
2337 0.99061 	[0.99079]
2344 0.9543 	[0.9554]
2346 0.97061 	[0.97088]
2351 0.9739 	[0.9742]
2353 0.99081 	[0.9909]
2360 0.95449 	[0.95424]
2362 0.9708 	[0.97116]
2367 0.9741 	[0.97387]
2369 0.98321 	[0.98292]
2376 0.96287 	[0.96273]
2378 0.96319 	[0.96212]
2383 0.98249 	[0.98251]
2385 0.98222 	[0.98198]
2392 0.96191 	[0.96177]
2394 0.96222 	[0.96108]
2399 0.98151 	[0.98182]
2401 0.99041 	[0.99034]
2408 0.9541 	[0.95492]
2410 0.97041 	[0.97084]
2415 0.9737 	[0.9737]
2417 0.99061 	[0.99064, 0.99078, 0.99049, 0.99059, 0.99054]
2418 0.9628 	[0.96333]
2419 0.97469 	[0.97434]
2420 0.96287 	[0.96322]
2421 0.98202 	[0.98196]
2422 0.97022 	[0.97052]
2423 0.9821 	[0.98276]
2424 0.9543 	[0.95555]
2425 0.98241 	[0.98215]
2426 0.97061 	[0.97074]
2427 0.98249 	[0.98255]
2428 0.95468 	[0.95596]
2429 0.98982 	[0.98981]
2430 0.96202 	[0.96254]
2431 0.9739 	[0.97363]
2432 0.9621 	[0.96279]
2433 0.98281 	[0.98317]
2440 0.96249 	[0.96218]
2442 0.9628 	[0.96306]
2447 0.9821 	[0.98227]
2449 0.98182 	[0.98173, 0.98214, 0.98201, 0.98149, 0.98193]
2450 0.97002 	[0.97049]
2451 0.9819 	[0.98197]
2452 0.9541 	[0.95531]
2453 0.98923 	[0.98912]
2454 0.96144 	[0.9618]
2455 0.97331 	[0.97409]
2456 0.96153 	[0.9616]
2457 0.98962 	[0.98958]
2458 0.96183 	[0.96159]
2459 0.9737 	[0.97363]
2460 0.96191 	[0.96384]
2461 0.98103 	[0.98125]
2462 0.96925 	[0.96961]
2463 0.98112 	[0.98115]
2464 0.95333 	[0.95474]
2465 0.99002 	[0.99015]
2472 0.95372 	[0.95451]
2474 0.97002 	[0.9698]
2479 0.97331 	[0.97335]
2481 0.99022 	[0.98989]
2488 0.95391 	[0.95494]
2490 0.97022 	[0.97082]
2495 0.97351 	[0.97329]
2497 0.98261 	[0.98281]
2504 0.9623 	[0.9618]
2506 0.96261 	[0.96332]
2511 0.9819 	[0.98171]
2513 0.98162 	[0.98141]
2520 0.96133 	[0.96158]
2522 0.96163 	[0.96225]
2527 0.98092 	[0.98075]
2529 0.98982 	[0.98988, 0.98995, 0.98975, 0.9898, 0.98975]
2530 0.96202 	[0.96235]
2531 0.9739 	[0.97386]
2532 0.9621 	[0.9629]
2533 0.98123 	[0.98121]
2534 0.96944 	[0.96963]
2535 0.98131 	[0.98193]
2536 0.95353 	[0.95451]
2537 0.98162 	[0.98138]
2538 0.96983 	[0.96987]
2539 0.9817 	[0.98195]
2540 0.95391 	[0.95514]
2541 0.98903 	[0.98907]
2542 0.96125 	[0.96167]
2543 0.97312 	[0.97294]
2544 0.96133 	[0.96237]
2545 0.99002 	[0.99006]
2552 0.95372 	[0.95373]
2554 0.97002 	[0.97015]
2559 0.97331 	[0.97296]
2561 0.95142 	[0.95128]
2568 0.932 	[0.93179]
2570 0.93205 	[0.93218]
2575 0.95099 	[0.9505]
2673 0.95866 	[0.95806]
2680 0.92326 	[0.9237]
2682 0.9393 	[0.93844]
2687 0.94223 	[0.94256]
2705 0.94989 	[0.9496]
2712 0.93051 	[0.93019]
2714 0.93054 	[0.93041]
2719 0.94947 	[0.94886]
2785 0.95789 	[0.95747]
2792 0.92251 	[0.92287]
2794 0.93854 	[0.93755]
2799 0.94147 	[0.94172]
2817 0.95143 	[0.95153]
2824 0.91623 	[0.91568]
2826 0.93221 	[0.93108]
2831 0.93506 	[0.93468]
2929 0.94267 	[0.9423]
2936 0.92349 	[0.92419]
2938 0.92347 	[0.92289]
2943 0.94231 	[0.94234]
2961 0.94991 	[0.95035]
2968 0.91476 	[0.91538]
2970 0.93072 	[0.93034]
2975 0.93355 	[0.93381]
3041 0.94191 	[0.94125]
3048 0.92276 	[0.92271]
3050 0.92272 	[0.9217]
3055 0.94155 	[0.94166]
3073 0.9992 	[0.99922, 0.99919]
3076 0.97122 	[0.97105]
3080 0.96264 	[0.96244]
3082 0.97902 	[0.97929]
3087 0.98241 	[0.98264]
3101 0.9974 	[0.99738]
3104 0.96947 	[0.96932]
3109 0.9978 	[0.99778]
3112 0.96986 	[0.96972]
3129 0.9984 	[0.99839]
3132 0.97045 	[0.9705]
3185 0.9904 	[0.99059]
3192 0.96986 	[0.96985]
3194 0.97024 	[0.971]
3199 0.98962 	[0.98978]
3217 0.9976 	[0.99761]
3224 0.96109 	[0.96122]
3226 0.97745 	[0.97779]
3231 0.98083 	[0.98131]
3297 0.9896 	[0.98974]
3304 0.96909 	[0.96902]
3306 0.96945 	[0.96998]
3311 0.98883 	[0.98882]
3329 0.98321 	[0.98341]
3336 0.96287 	[0.96342]
3338 0.96319 	[0.96341]
3343 0.98249 	[0.98281]
3441 0.99041 	[0.99088]
3448 0.9541 	[0.9553]
3450 0.97041 	[0.97053]
3455 0.9737 	[0.97418]
3457 0.98261 	[0.98223]
3460 0.95487 	[0.95485]
3473 0.98162 	[0.98196]
3480 0.96133 	[0.96198]
3482 0.96163 	[0.96195]
3485 0.98083 	[0.98034]
3487 0.98092 	[0.98138]
3488 0.95314 	[0.95332]
3493 0.98123 	[0.98083]
3496 0.95353 	[0.9537]
3513 0.98182 	[0.98151]
3516 0.9541 	[0.95419]
3553 0.98962 	[0.98999]
3560 0.95333 	[0.95393]
3562 0.96963 	[0.96981]
3567 0.97292 	[0.97362]
3585 0.95123 	[0.95148, 0.95154, 0.95103, 0.95167, 0.95158, 0.95104, 0.95147, 0.95099, 0.95201]
3586 0.94005 	[0.94102]
3587 0.95156 	[0.95162]
3588 0.92437 	[0.92421]
3589 0.95866 	[0.95905]
3590 0.93148 	[0.93216]
3591 0.94299 	[0.94381]
3592 0.93181 	[0.93264, 0.93092, 0.9328, 0.93195, 0.93312]
3593 0.95904 	[0.95941]
3594 0.93186 	[0.93225, 0.9307, 0.93205, 0.93236, 0.9337]
3595 0.94337 	[0.94345]
3596 0.93219 	[0.9332]
3597 0.95046 	[0.95082]
3598 0.9393 	[0.93948]
3599 0.9508 	[0.95153, 0.9505, 0.95127, 0.95043, 0.95112]
3600 0.92363 	[0.92336]
3601 0.95027 	[0.9506]
3608 0.93088 	[0.93246]
3610 0.93092 	[0.93133]
3615 0.94985 	[0.95029]
3617 0.95846 	[0.95802]
3624 0.92307 	[0.9233]
3626 0.93911 	[0.93874]
3631 0.94204 	[0.94163]
3633 0.95866 	[0.95912]
3640 0.92325 	[0.92468]
3642 0.9393 	[0.94051]
3647 0.94223 	[0.94247]
3649 0.95104 	[0.95115]
3656 0.93163 	[0.93291]
3658 0.93167 	[0.93232]
3663 0.95061 	[0.95112]
3665 0.95008 	[0.95025]
3672 0.9307 	[0.93246]
3674 0.93073 	[0.93159]
3679 0.94966 	[0.95008]
3681 0.95827 	[0.95843]
3688 0.92288 	[0.92209]
3690 0.93892 	[0.93801]
3695 0.94185 	[0.9408]
3697 0.95846 	[0.95861, 0.95866, 0.95819, 0.95868, 0.95778]
3698 0.9313 	[0.93153]
3699 0.9428 	[0.94284]
3700 0.93163 	[0.93041]
3701 0.94989 	[0.94942]
3702 0.93873 	[0.93963]
3703 0.95023 	[0.95076]
3704 0.92307 	[0.92418]
3705 0.95027 	[0.95084]
3706 0.93911 	[0.93923]
3707 0.95061 	[0.95025]
3708 0.92344 	[0.92447]
3709 0.9577 	[0.95712]
3710 0.93054 	[0.93136]
3711 0.94204 	[0.94259]
3712 0.93088 	[0.92969]
3713 0.95066 	[0.95036]
3720 0.93125 	[0.9314]
3722 0.9313 	[0.9316]
3727 0.95023 	[0.94969]
3729 0.9497 	[0.94999, 0.95036, 0.94905, 0.9499, 0.94947]
3730 0.93854 	[0.93912]
3731 0.95004 	[0.94955]
3732 0.92288 	[0.92197]
3733 0.95712 	[0.95731]
3734 0.92998 	[0.9303]
3735 0.94147 	[0.94225]
3736 0.93032 	[0.93107]
3737 0.95751 	[0.95747]
3738 0.93036 	[0.93043]
3739 0.94185 	[0.9417]
3740 0.9307 	[0.93185]
3741 0.94893 	[0.94881]
3742 0.93779 	[0.93844]
3743 0.94928 	[0.95003]
3744 0.92214 	[0.9213]
3745 0.95789 	[0.95722]
3752 0.92251 	[0.92265]
3754 0.93854 	[0.93761]
3759 0.94147 	[0.94083]
3761 0.95808 	[0.95825]
3768 0.9227 	[0.92199]
3770 0.93873 	[0.93805]
3775 0.94166 	[0.94085]
3777 0.95046 	[0.95106, 0.95019]
3780 0.92363 	[0.92298]
3784 0.93107 	[0.93288]
3786 0.93111 	[0.9328]
3791 0.95004 	[0.95002]
3793 0.94951 	[0.94881]
3800 0.93014 	[0.92922]
3802 0.93017 	[0.92872]
3805 0.94874 	[0.94885]
3807 0.94909 	[0.94873]
3808 0.92195 	[0.92138]
3809 0.9577 	[0.95791, 0.95816, 0.95778, 0.95781, 0.95766]
3810 0.93054 	[0.93104]
3811 0.94204 	[0.94193]
3812 0.93088 	[0.93056]
3813 0.94912 	[0.9493, 0.94971]
3814 0.93798 	[0.93955]
3815 0.94947 	[0.95]
3816 0.92232 	[0.92291, 0.9219]
3817 0.94951 	[0.94962]
3818 0.93836 	[0.93853]
3819 0.94985 	[0.94959]
3820 0.9227 	[0.92437]
3821 0.95693 	[0.95713]
3822 0.92979 	[0.93113]
3823 0.94128 	[0.9418]
3824 0.93014 	[0.92989]
3825 0.95789 	[0.95791]
3832 0.92251 	[0.92411]
3833 0.9497 	[0.94993]
3834 0.93854 	[0.93945]
3836 0.92288 	[0.92286]
3839 0.94147 	[0.94162]
3841 0.95124 	[0.95228]
3848 0.91605 	[0.91584]
3850 0.93203 	[0.93288]
3855 0.93487 	[0.9351]
3905 0.95105 	[0.95086]
3908 0.92442 	[0.92359]
3933 0.94934 	[0.9493]
3936 0.92276 	[0.92191]
3941 0.94972 	[0.95013]
3944 0.92313 	[0.92249]
3953 0.94248 	[0.94266]
3960 0.92331 	[0.92275]
3961 0.95029 	[0.95035]
3962 0.92328 	[0.92347]
3964 0.92368 	[0.92336]
3967 0.94212 	[0.94212]
3985 0.94972 	[0.95038]
3992 0.91457 	[0.91486]
3994 0.93054 	[0.93119]
3999 0.93337 	[0.93337]
4065 0.94172 	[0.94233]
4072 0.92257 	[0.92296]
4074 0.92254 	[0.92375]
4079 0.94137 	[0.94211]

Finally, we need to create the offset experiments to recover the remaining fidelities.

For the single-qubit MUBs we need the other two MUBs; for the two-qubit MUBs, the other four.
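The extraction step below assembles each 6-qubit Pauli index from per-group labels in base 4: `p6*4^5 + p45*4^3 + p23*4 + p1 + 1`, with the `+1` because Julia indexes from 1. A minimal sketch of that arithmetic (this `binaryArrayToNumber` is a stand-in for the workbook's helper and may use a different bit convention; the bit patterns are made up):

```julia
# Stand-in for the workbook's helper: interpret a bit array (MSB first)
# as an integer. The real helper's bit convention may differ.
binaryArrayToNumber(bits) = foldl((acc, b) -> 2acc + b, bits; init = 0)

# Hypothetical per-group Pauli labels (one base-4 digit per qubit):
p6  = binaryArrayToNumber([0, 1])        # single-qubit group (qubit 6)
p45 = binaryArrayToNumber([0, 0, 1, 1])  # two-qubit group (qubits 4&5)
p23 = binaryArrayToNumber([0, 0, 0, 1])  # two-qubit group (qubits 2&3)
p1  = binaryArrayToNumber([1, 0])        # single-qubit group (qubit 1)

# Pack the digit blocks into a single 1-based index into the 4^6 Paulis.
index = p6 * 4^5 + p45 * 4^3 + p23 * 4 + p1 + 1
```

Here `p6 = 1`, `p45 = 3`, `p23 = 1`, `p1 = 2`, so `index == 1223`.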

In [49]:
e2_all_additional_fidelities = []
e2_fidelity_extracted = []
# Note: here I am hard-coding that groups 1 and 4 are the single-qubit twirls.
e_count = 0
e2_all_actualProbabilities = []
@showprogress 1 "Qubits: " for qubitOn in 1:4 # qubit groups that can be "on": 1, 2&3, 4&5, and 6
    
    if qubitOn == 2 || qubitOn == 3
        noOfExperiments = 5
    else 
        noOfExperiments = 3 # Only 3 if it's a single qubit.
    end
    for experimentType = 1:noOfExperiments
        if experiments[2][qubitOn][2] != experimentType
            # It's one we haven't done yet.
            expOnFirstSet = experiments[2][1][2]
            expOnSecondSet = experiments[2][2][2]
            expOnThirdSet = experiments[2][3][2]
            expOnFourthSet = experiments[2][4][2]


            if qubitOn == 1
                expOnFirstSet = experimentType
            elseif qubitOn == 2
                expOnSecondSet = experimentType
            elseif qubitOn == 3
                expOnThirdSet = experimentType
            else 
                expOnFourthSet = experimentType
            end
                
            initialGates = circuit1q[expOnFourthSet]⊗circuit2q[expOnThirdSet]⊗circuit2q[expOnSecondSet]⊗circuit1q[expOnFirstSet]

            reverseGates = transpose(initialGates) # yay superoperators.
            # So at this point we have set up one of the 'new' experiments.
            additionalExperiment = []
            # Get the actual probabilities.
            @showprogress 1 "QGroup $qubitOn Exp: $experimentType: " for m in lengths
                wholeCircuit = reverseGates*noise^m*initialGates*rm*start./64 # Normalising our zs
                probs = [z'*wholeCircuit for z in zs]
                push!(additionalExperiment,probs)
            end
            push!(e2_all_actualProbabilities,additionalExperiment)
            
            # Generate the measurement statistics
            cumMatrix = map(cumsum,additionalExperiment)
            experiment2_additional_observed =  shotSimulator(64,shotsToDo,cumMatrix);
            # Fit and extract the fidelities
            (params,l, failed) = fitTheFidelities(lengths,experiment2_additional_observed)
            experiment2_additional_fidelities = vcat(1,[p[2] for p in params])
            push!(e2_all_additional_fidelities,experiment2_additional_fidelities)
            fidelitiesExtracted=[]
            for x1 in potentialSingles[expOnFourthSet] # Experiment 2; qubit groups (q1, q2&3, q4&5, q6); the second index = experiment number
                p6 = binaryArrayToNumber(x1)
                for x2b in all2QlMuBs[expOnThirdSet]
                    p45 = binaryArrayToNumber(x2b)
                    for x2 in all2QlMuBs[expOnSecondSet]
                        p23 = binaryArrayToNumber(x2)
                        for x3 in potentialSingles[expOnFirstSet]
                            p1 = binaryArrayToNumber(x3)
                            push!(fidelitiesExtracted,p6*4^5+p45*4^3+p23*4+p1+1) # +1 because Julia indexes from 1
                        end
                    end
                end
            end
            push!(e2_fidelity_extracted,fidelitiesExtracted)
        end
    end
end
QGroup 1 Exp: 1: 100%|██████████████████████████████████| Time: 0:01:16
QGroup 1 Exp: 2: 100%|██████████████████████████████████| Time: 0:01:17
QGroup 2 Exp: 1: 100%|██████████████████████████████████| Time: 0:01:17
QGroup 2 Exp: 3: 100%|██████████████████████████████████| Time: 0:01:16
QGroup 2 Exp: 4: 100%|██████████████████████████████████| Time: 0:01:17
QGroup 2 Exp: 5: 100%|██████████████████████████████████| Time: 0:01:17
QGroup 3 Exp: 2: 100%|██████████████████████████████████| Time: 0:01:18
QGroup 3 Exp: 3: 100%|██████████████████████████████████| Time: 0:01:17
QGroup 3 Exp: 4: 100%|██████████████████████████████████| Time: 0:01:16
QGroup 3 Exp: 5: 100%|██████████████████████████████████| Time: 0:01:16
QGroup 4 Exp: 1: 100%|██████████████████████████████████| Time: 0:01:17
QGroup 4 Exp: 2: 100%|██████████████████████████████████| Time: 0:01:16
Qubits: 100%|███████████████████████████████████████████| Time: 0:15:26
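The `fitTheFidelities` call above fits each measured decay p(m) ≈ A·f^m over the sequence lengths and returns the decay parameters. A minimal stand-in under that model (log-linear least squares; `fit_decay` is a hypothetical sketch, not the workbook's implementation, which also handles shot noise and failed fits):

```julia
# Hypothetical stand-in for the decay fit: given sequence lengths m and
# probabilities p(m) ≈ A * f^m, a log-linear least-squares fit recovers f.
function fit_decay(lengths, probs)
    x = float.(lengths)
    y = log.(probs)
    xbar = sum(x) / length(x)
    ybar = sum(y) / length(y)
    slope = sum((x .- xbar) .* (y .- ybar)) / sum((x .- xbar) .^ 2)
    return (A = exp(ybar - slope * xbar), f = exp(slope))
end

# Noise-free check: data generated with A = 0.98, f = 0.97.
lengths = [1, 2, 4, 8, 16]
probs = 0.98 .* 0.97 .^ lengths
A, f = fit_decay(lengths, probs)
```

In the real run each of the 64 outcome curves is fitted this way, one decay parameter per Pauli in the commuting set.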
In [50]:
e2_all_additional_fidelities[1]
Out[50]:
64-element Array{Float64,1}:
 1.0
 0.9717523697451188
 0.9991676369755909
 0.9709244723454289
 0.9981766764813327
 0.9699284238623218
 0.9986231009644055
 0.9702564827175163
 0.9512418846043682
 0.9236218399391224
 0.9508934528804082
 0.9232719057295831
 0.9495534658317248
 ⋮
 0.9487124039115584
 0.9369045824609961
 0.9492380011931869
 0.9374605064621049
 0.982588164588106
 0.9704544251064045
 0.98176033631428
 0.9697576995707274
 0.9807633296559303
 0.968580298226486
 0.9812418357678202
 0.9690889216859961
In [51]:
# So for example, if we look at experiment 8 of these additional experiments, we see
toExtract = 8

# And, just out of interest, we can compare the fidelities we extracted with the actual values:
for i in 1:64
    print("$(fidelityLabels(e2_fidelity_extracted[toExtract][i]-1,qubits=6)) ",
          "($(e2_fidelity_extracted[toExtract][i])): ",
          "\tEstimate: $(round(e2_all_additional_fidelities[toExtract][i],digits=5))",
          " <-> $(round(actualOracle[e2_fidelity_extracted[toExtract][i]],digits=5)) ",
          "\tPercentage Error: $(round.(abs(actualOracle[e2_fidelity_extracted[toExtract][i]]-e2_all_additional_fidelities[toExtract][i])/(actualOracle[e2_fidelity_extracted[toExtract][i]])*100,digits=4))%\n")
    
end
IIIIII (1): 	Estimate: 1.0 <-> 1.0 	Percentage Error: 0.0%
IIIIIZ (4): 	Estimate: 0.97211 <-> 0.972 	Percentage Error: 0.011%
IIIZYI (57): 	Estimate: 0.99915 <-> 0.9992 	Percentage Error: 0.0049%
IIIZYZ (60): 	Estimate: 0.97114 <-> 0.97122 	Percentage Error: 0.0086%
IIIXZI (29): 	Estimate: 0.99812 <-> 0.9982 	Percentage Error: 0.0085%
IIIXZZ (32): 	Estimate: 0.97031 <-> 0.97025 	Percentage Error: 0.0062%
IIIYXI (37): 	Estimate: 0.99859 <-> 0.9986 	Percentage Error: 0.0007%
IIIYXZ (40): 	Estimate: 0.97072 <-> 0.97064 	Percentage Error: 0.0081%
IIXIII (65): 	Estimate: 0.99976 <-> 0.9998 	Percentage Error: 0.0043%
IIXIIZ (68): 	Estimate: 0.97172 <-> 0.97181 	Percentage Error: 0.0092%
IIXZYI (121): 	Estimate: 0.99891 <-> 0.999 	Percentage Error: 0.0095%
IIXZYZ (124): 	Estimate: 0.97079 <-> 0.97103 	Percentage Error: 0.0241%
IIXXZI (93): 	Estimate: 0.99786 <-> 0.998 	Percentage Error: 0.0137%
IIXXZZ (96): 	Estimate: 0.97 <-> 0.97006 	Percentage Error: 0.0062%
IIXYXI (101): 	Estimate: 0.99833 <-> 0.9984 	Percentage Error: 0.0072%
IIXYXZ (104): 	Estimate: 0.97046 <-> 0.97045 	Percentage Error: 0.0015%
IXIIII (257): 	Estimate: 0.98445 <-> 0.984 	Percentage Error: 0.0457%
IXIIIZ (260): 	Estimate: 0.95649 <-> 0.95622 	Percentage Error: 0.0282%
IXIZYI (313): 	Estimate: 0.98355 <-> 0.98321 	Percentage Error: 0.0348%
IXIZYZ (316): 	Estimate: 0.95548 <-> 0.95545 	Percentage Error: 0.0031%
IXIXZI (285): 	Estimate: 0.98261 <-> 0.98222 	Percentage Error: 0.0401%
IXIXZZ (288): 	Estimate: 0.95473 <-> 0.95449 	Percentage Error: 0.0248%
IXIYXI (293): 	Estimate: 0.98299 <-> 0.98261 	Percentage Error: 0.0384%
IXIYXZ (296): 	Estimate: 0.95473 <-> 0.95487 	Percentage Error: 0.0149%
IXXIII (321): 	Estimate: 0.98407 <-> 0.9838 	Percentage Error: 0.0271%
IXXIIZ (324): 	Estimate: 0.95597 <-> 0.95603 	Percentage Error: 0.0063%
IXXZYI (377): 	Estimate: 0.98315 <-> 0.98301 	Percentage Error: 0.0145%
IXXZYZ (380): 	Estimate: 0.95497 <-> 0.95526 	Percentage Error: 0.0301%
IXXXZI (349): 	Estimate: 0.98218 <-> 0.98202 	Percentage Error: 0.0166%
IXXXZZ (352): 	Estimate: 0.95408 <-> 0.9543 	Percentage Error: 0.0222%
IXXYXI (357): 	Estimate: 0.98256 <-> 0.98241 	Percentage Error: 0.0145%
IXXYXZ (360): 	Estimate: 0.9542 <-> 0.95468 	Percentage Error: 0.0509%
ZIIIII (3073): 	Estimate: 0.99926 <-> 0.9992 	Percentage Error: 0.0056%
ZIIIIZ (3076): 	Estimate: 0.97128 <-> 0.97122 	Percentage Error: 0.0063%
ZIIZYI (3129): 	Estimate: 0.9984 <-> 0.9984 	Percentage Error: 0.0001%
ZIIZYZ (3132): 	Estimate: 0.97035 <-> 0.97045 	Percentage Error: 0.0094%
ZIIXZI (3101): 	Estimate: 0.99738 <-> 0.9974 	Percentage Error: 0.0024%
ZIIXZZ (3104): 	Estimate: 0.96949 <-> 0.96947 	Percentage Error: 0.0021%
ZIIYXI (3109): 	Estimate: 0.99785 <-> 0.9978 	Percentage Error: 0.0048%
ZIIYXZ (3112): 	Estimate: 0.96989 <-> 0.96986 	Percentage Error: 0.0025%
ZIXIII (3137): 	Estimate: 0.99901 <-> 0.999 	Percentage Error: 0.0006%
ZIXIIZ (3140): 	Estimate: 0.9709 <-> 0.97103 	Percentage Error: 0.013%
ZIXZYI (3193): 	Estimate: 0.99816 <-> 0.9982 	Percentage Error: 0.0045%
ZIXZYZ (3196): 	Estimate: 0.97004 <-> 0.97025 	Percentage Error: 0.022%
ZIXXZI (3165): 	Estimate: 0.99713 <-> 0.9972 	Percentage Error: 0.0074%
ZIXXZZ (3168): 	Estimate: 0.96922 <-> 0.96928 	Percentage Error: 0.0067%
ZIXYXI (3173): 	Estimate: 0.99758 <-> 0.9976 	Percentage Error: 0.0017%
ZIXYXZ (3176): 	Estimate: 0.96966 <-> 0.96967 	Percentage Error: 0.0013%
ZXIIII (3329): 	Estimate: 0.98362 <-> 0.98321 	Percentage Error: 0.0418%
ZXIIIZ (3332): 	Estimate: 0.95567 <-> 0.95545 	Percentage Error: 0.023%
ZXIZYI (3385): 	Estimate: 0.98272 <-> 0.98241 	Percentage Error: 0.0316%
ZXIZYZ (3388): 	Estimate: 0.95465 <-> 0.95468 	Percentage Error: 0.003%
ZXIXZI (3357): 	Estimate: 0.98183 <-> 0.98142 	Percentage Error: 0.041%
ZXIXZZ (3360): 	Estimate: 0.95389 <-> 0.95372 	Percentage Error: 0.0179%
ZXIYXI (3365): 	Estimate: 0.98219 <-> 0.98182 	Percentage Error: 0.0379%
ZXIYXZ (3368): 	Estimate: 0.95385 <-> 0.9541 	Percentage Error: 0.0267%
ZXXIII (3393): 	Estimate: 0.98325 <-> 0.98301 	Percentage Error: 0.0247%
ZXXIIZ (3396): 	Estimate: 0.95515 <-> 0.95526 	Percentage Error: 0.0116%
ZXXZYI (3449): 	Estimate: 0.98235 <-> 0.98222 	Percentage Error: 0.0138%
ZXXZYZ (3452): 	Estimate: 0.95413 <-> 0.95449 	Percentage Error: 0.038%
ZXXXZI (3421): 	Estimate: 0.98142 <-> 0.98123 	Percentage Error: 0.0197%
ZXXXZZ (3424): 	Estimate: 0.95324 <-> 0.95353 	Percentage Error: 0.0299%
ZXXYXI (3429): 	Estimate: 0.98178 <-> 0.98162 	Percentage Error: 0.0159%
ZXXYXZ (3432): 	Estimate: 0.95333 <-> 0.95391 	Percentage Error: 0.0613%
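Each oracle entry accumulates every estimate made of that Pauli fidelity, and some Paulis end up estimated many times across the experiments. One natural way to collapse repeated estimates into a single value, shown here purely as an illustration with made-up numbers (not the workbook's own step), is a median:

```julia
using Statistics  # for median

# Illustrative only: repeated estimates of one Pauli fidelity, collapsed
# to a single robust value with a median. The numbers here are made up.
estimates = [0.99038, 0.99055, 0.99058, 0.99016, 0.99042]
collapsed = median(estimates)  # -> 0.99042
```

A median is more robust than a mean to the occasional poorly-fitted decay.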
In [52]:
# Okay, load them all up into the oracle.

for (expNo,x) in enumerate(e2_fidelity_extracted)
    for (fidelityIndex,fidelity)  in enumerate(x)
        push!(estimateOracle[fidelity],e2_all_additional_fidelities[expNo][fidelityIndex])
    end
end
In [53]:
# our oracle so far
for i in 1:4096
    if estimateOracle[i]!=[]
        print("$(string(i,pad=3)) $(round.(actualOracle[i],digits=5)) \t$(round.(estimateOracle[i],digits=5))\n")
    end
end
001 1.0 	[1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
002 0.972 	[0.97245, 0.97175]
003 0.984 	[0.98379, 0.98434]
004 0.972 	[0.97222, 0.97185, 0.97194, 0.97223, 0.97192, 0.97228, 0.97207, 0.97211, 0.97152, 0.97182, 0.97204, 0.97236]
005 0.9914 	[0.99144, 0.99129]
006 0.97941 	[0.97957]
007 0.9914 	[0.99154]
008 0.96342 	[0.96466, 0.96398, 0.96294, 0.96325, 0.9632, 0.96397, 0.96343, 0.96334, 0.96326, 0.96409]
009 0.9918 	[0.99173, 0.99173]
010 0.9798 	[0.98004, 0.98015, 0.97996, 0.97959, 0.97987, 0.9798, 0.97983, 0.97964, 0.98009]
011 0.9918 	[0.99173]
012 0.96381 	[0.96422, 0.96408]
013 0.9992 	[0.99924, 0.99917]
014 0.97122 	[0.97099]
015 0.98321 	[0.98326, 0.98301, 0.98303, 0.98328, 0.98312, 0.98329, 0.98322, 0.98338, 0.98344]
016 0.97122 	[0.97164, 0.97164]
017 0.999 	[0.99905, 0.99899]
020 0.97103 	[0.97143]
021 0.9904 	[0.99029]
024 0.96245 	[0.9621, 0.96326]
025 0.9908 	[0.99083]
026 0.97882 	[0.97902]
028 0.96283 	[0.96294]
029 0.9982 	[0.9982, 0.99818, 0.99815, 0.99821, 0.99812, 0.99822, 0.99821, 0.9982, 0.99819]
030 0.97024 	[0.96993]
031 0.98222 	[0.98198, 0.98257]
032 0.97025 	[0.9701, 0.97018, 0.97031, 0.96956, 0.97029, 0.97036, 0.97099]
033 0.9912 	[0.99122, 0.99117]
036 0.96322 	[0.96366]
037 0.9986 	[0.99859, 0.99862, 0.99852, 0.99864, 0.99859, 0.99861, 0.99864, 0.99859, 0.99862]
038 0.97063 	[0.97026]
039 0.98261 	[0.9829]
040 0.97064 	[0.96982, 0.97056, 0.97076, 0.97072, 0.97013, 0.97058, 0.97082, 0.97129]
041 0.999 	[0.99902]
042 0.97102 	[0.97078]
044 0.97103 	[0.97108]
045 0.9904 	[0.99044]
047 0.99041 	[0.99034]
048 0.96245 	[0.96289]
049 0.9914 	[0.99143, 0.99137]
052 0.96342 	[0.96381]
053 0.9988 	[0.99885]
056 0.97083 	[0.97072, 0.97095]
057 0.9992 	[0.9992, 0.99917, 0.99916, 0.99924, 0.99915, 0.99923, 0.99924, 0.99921, 0.9992]
058 0.97122 	[0.97188, 0.97092]
059 0.98321 	[0.9836]
060 0.97122 	[0.97131, 0.97122, 0.97114, 0.97065, 0.97113, 0.9713, 0.97159]
061 0.9906 	[0.99053]
063 0.99061 	[0.99061]
064 0.96264 	[0.96339]
065 0.9998 	[0.99983, 0.99976]
068 0.97181 	[0.97172]
072 0.96322 	[0.96257]
074 0.97961 	[0.97967]
079 0.98301 	[0.98293]
081 0.9988 	[0.99889]
088 0.96225 	[0.96175]
090 0.97863 	[0.97867]
093 0.998 	[0.99786]
095 0.98202 	[0.9819]
096 0.97006 	[0.97]
097 0.991 	[0.99091]
101 0.9984 	[0.99833]
104 0.97045 	[0.97043, 0.97046]
106 0.97082 	[0.97099]
111 0.99022 	[0.99018]
113 0.9912 	[0.9911, 0.99132, 0.99138, 0.99103, 0.99118, 0.99143, 0.99122, 0.99126, 0.99144]
114 0.97922 	[0.97945]
115 0.99121 	[0.99099]
116 0.96322 	[0.96393]
117 0.9986 	[0.99861]
118 0.97063 	[0.97115]
119 0.98261 	[0.98272]
120 0.97064 	[0.97126, 0.9713, 0.97094, 0.97077, 0.97058]
121 0.999 	[0.99898, 0.99891]
122 0.97102 	[0.97163, 0.97103, 0.97088, 0.97107, 0.97178]
123 0.98301 	[0.98273]
124 0.97103 	[0.97133, 0.97079]
125 0.9904 	[0.99044]
126 0.97843 	[0.97817]
127 0.99041 	[0.99052, 0.99049, 0.99049, 0.9904, 0.99054]
128 0.96245 	[0.96317]
129 0.9994 	[0.99942, 0.99941]
132 0.97142 	[0.97059]
136 0.96283 	[0.96246]
138 0.97922 	[0.9786]
143 0.98261 	[0.98276]
145 0.9984 	[0.99839, 0.99843, 0.99843, 0.99828, 0.99835, 0.99856, 0.99835, 0.9983, 0.99838]
146 0.97043 	[0.97101]
147 0.98241 	[0.98191]
148 0.97045 	[0.97054]
149 0.9898 	[0.98999]
150 0.97785 	[0.9781]
151 0.98982 	[0.98993]
152 0.96186 	[0.96303, 0.96228, 0.96186, 0.9618, 0.96182]
153 0.9902 	[0.99001]
154 0.97824 	[0.97831, 0.97833, 0.97787, 0.97816, 0.9784]
155 0.99022 	[0.98994]
156 0.96225 	[0.96273]
157 0.9976 	[0.99757, 0.99769]
158 0.96965 	[0.96954]
159 0.98162 	[0.98164, 0.98198, 0.98148, 0.98163, 0.98207]
160 0.96967 	[0.97002, 0.96858]
161 0.9906 	[0.99076]
165 0.998 	[0.99808]
168 0.97006 	[0.96884, 0.9693]
170 0.97043 	[0.96998]
175 0.98982 	[0.98971]
177 0.9908 	[0.99058]
184 0.97025 	[0.97032]
185 0.9986 	[0.99866]
186 0.97063 	[0.97082]
188 0.97064 	[0.96972]
191 0.99002 	[0.98983]
193 0.9992 	[0.99919, 0.99922]
196 0.97122 	[0.97118]
200 0.96264 	[0.96275]
202 0.97902 	[0.97906]
207 0.98241 	[0.98238]
209 0.9982 	[0.99823]
216 0.96167 	[0.96222]
218 0.97804 	[0.9784]
221 0.9974 	[0.99739]
223 0.98142 	[0.98127]
224 0.96947 	[0.96958]
225 0.9904 	[0.99038, 0.99055, 0.99058, 0.99016, 0.99042, 0.99059, 0.99034, 0.99046, 0.99057]
226 0.97843 	[0.97859]
227 0.99041 	[0.99021]
228 0.96245 	[0.96336]
229 0.9978 	[0.99777, 0.99782]
230 0.96985 	[0.97035]
231 0.98182 	[0.98204]
232 0.96986 	[0.97036, 0.97029, 0.96985, 0.97001, 0.96985, 0.96982]
233 0.9982 	[0.99811]
234 0.97024 	[0.9708, 0.97016, 0.97003, 0.96999, 0.97078]
235 0.98222 	[0.9818]
236 0.97025 	[0.97049]
237 0.9896 	[0.98967]
238 0.97765 	[0.97746]
239 0.98962 	[0.98974, 0.98969, 0.9898, 0.98967, 0.98959]
240 0.96167 	[0.96259]
241 0.9906 	[0.99058]
248 0.97006 	[0.97013]
249 0.9984 	[0.99843]
250 0.97043 	[0.97112]
252 0.97045 	[0.97051]
255 0.98982 	[0.98974]
257 0.984 	[0.98416, 0.98445]
260 0.95622 	[0.95649]
264 0.96365 	[0.96451]
266 0.96397 	[0.96369]
271 0.98328 	[0.98343]
285 0.98222 	[0.98261]
288 0.95449 	[0.95473]
293 0.98261 	[0.98299]
296 0.95487 	[0.95473]
313 0.98321 	[0.98355]
316 0.95545 	[0.95548]
321 0.9838 	[0.98407]
324 0.95603 	[0.95597]
349 0.98202 	[0.98218]
352 0.9543 	[0.95408]
357 0.98241 	[0.98256]
360 0.95468 	[0.9542]
369 0.99121 	[0.99107]
376 0.95487 	[0.95525]
377 0.98301 	[0.98315]
378 0.97119 	[0.97133]
380 0.95526 	[0.95497]
383 0.97449 	[0.97444]
385 0.9834 	[0.98313, 0.98329, 0.98327, 0.98327, 0.98332, 0.98336, 0.98347, 0.9832, 0.98324]
386 0.97158 	[0.97107]
387 0.98347 	[0.98317]
388 0.95565 	[0.95579, 0.95631, 0.95641, 0.95696, 0.95579, 0.95409, 0.95602]
389 0.99081 	[0.99081]
392 0.96307 	[0.96389]
393 0.99121 	[0.99119]
396 0.96345 	[0.96467]
397 0.98261 	[0.98263]
400 0.95487 	[0.9554]
401 0.98241 	[0.98228, 0.98244]
404 0.95468 	[0.95573]
405 0.98982 	[0.98981]
408 0.9621 	[0.96265, 0.96318]
409 0.99022 	[0.99031]
410 0.96241 	[0.96167]
412 0.96249 	[0.96321]
413 0.98162 	[0.98128, 0.98139, 0.98159, 0.98135, 0.98154]
414 0.96983 	[0.96917]
415 0.9817 	[0.98183, 0.98139]
416 0.95391 	[0.95423, 0.95222, 0.95465]
417 0.99061 	[0.99058]
420 0.96287 	[0.96432]
421 0.98202 	[0.98175, 0.98191, 0.98193, 0.98172, 0.9819]
422 0.97022 	[0.96963]
423 0.9821 	[0.9818]
424 0.9543 	[0.9546, 0.95264, 0.9551]
425 0.98241 	[0.98247]
428 0.95468 	[0.95651]
429 0.98982 	[0.98991]
432 0.9621 	[0.96303]
433 0.99081 	[0.99094]
436 0.96307 	[0.96301]
437 0.98222 	[0.98213]
440 0.95449 	[0.95528]
441 0.98261 	[0.9824, 0.98243, 0.98264, 0.9824, 0.98253]
442 0.9708 	[0.97031]
443 0.98269 	[0.98238]
444 0.95487 	[0.95513, 0.95351, 0.95539]
445 0.99002 	[0.99005]
448 0.9623 	[0.96255]
449 0.98321 	[0.98361]
452 0.95545 	[0.95564]
477 0.98142 	[0.9817]
480 0.95372 	[0.9535]
481 0.99041 	[0.9903]
485 0.98182 	[0.98219]
488 0.9541 	[0.95443, 0.95382]
490 0.97041 	[0.97051]
495 0.9737 	[0.97375]
505 0.98241 	[0.98281]
508 0.95468 	[0.9546]
513 0.952 	[0.95181, 0.95145]
516 0.92512 	[0.92298]
520 0.93256 	[0.93235]
522 0.93261 	[0.93229]
527 0.95156 	[0.95112]
541 0.95027 	[0.94949]
544 0.92344 	[0.92113]
549 0.95066 	[0.9494]
552 0.92381 	[0.92138]
569 0.95123 	[0.95032]
572 0.92437 	[0.9221]
577 0.95181 	[0.95238]
580 0.92493 	[0.92492]
605 0.95008 	[0.95053]
608 0.92325 	[0.92298]
613 0.95046 	[0.95091]
616 0.92363 	[0.92339]
625 0.95923 	[0.95877]
632 0.92381 	[0.92436]
633 0.95104 	[0.95155]
634 0.93986 	[0.93907]
636 0.92419 	[0.92388]
639 0.9428 	[0.94314]
641 0.95142 	[0.95071]
644 0.92456 	[0.92257]
657 0.95046 	[0.95032]
664 0.93107 	[0.93085]
666 0.93111 	[0.93106]
669 0.9497 	[0.9489]
671 0.95004 	[0.94956]
672 0.92288 	[0.92086]
677 0.95008 	[0.94871]
680 0.92325 	[0.92105]
697 0.95066 	[0.94967]
700 0.92381 	[0.92168]
705 0.95123 	[0.95097, 0.95114, 0.95151, 0.95197, 0.95197, 0.95114, 0.95159, 0.95061, 0.95225]
706 0.94005 	[0.93963]
707 0.95156 	[0.95194]
708 0.92437 	[0.92376, 0.92586, 0.92511, 0.92437, 0.92485, 0.92348, 0.92578]
709 0.95866 	[0.95939]
712 0.93181 	[0.93286]
713 0.95904 	[0.95885]
716 0.93219 	[0.93177]
717 0.95046 	[0.9508]
720 0.92363 	[0.92421]
721 0.95027 	[0.95143]
724 0.92344 	[0.92471]
725 0.9577 	[0.95876]
728 0.93088 	[0.93218]
729 0.95808 	[0.95879]
732 0.93125 	[0.93287]
733 0.94951 	[0.94961, 0.9491, 0.94991, 0.9492, 0.95115]
734 0.93836 	[0.93741]
735 0.94985 	[0.95015]
736 0.9227 	[0.92213, 0.92149, 0.9251]
737 0.95846 	[0.95808, 0.95829]
740 0.93163 	[0.93164]
741 0.94989 	[0.9504, 0.94964, 0.95042, 0.94962, 0.95145]
742 0.93873 	[0.93793]
743 0.95023 	[0.95083]
744 0.92307 	[0.92348, 0.92261, 0.9221, 0.92536]
745 0.95027 	[0.95034]
746 0.93911 	[0.93804]
748 0.92344 	[0.92395]
749 0.9577 	[0.95828]
751 0.94204 	[0.94229]
752 0.93088 	[0.93273]
753 0.95866 	[0.95877]
756 0.93181 	[0.9322]
757 0.95008 	[0.95081]
760 0.92325 	[0.9247]
761 0.95046 	[0.9507, 0.95071, 0.95088, 0.95017, 0.95167]
762 0.9393 	[0.93919]
763 0.9508 	[0.9516]
764 0.92363 	[0.92358, 0.92281, 0.92518]
765 0.95789 	[0.95803]
768 0.93107 	[0.93162]
769 0.952 	[0.95314, 0.95203]
772 0.92534 	[0.92457]
776 0.91679 	[0.91672]
778 0.93277 	[0.9337]
783 0.93563 	[0.93591]
797 0.95029 	[0.9506]
800 0.92368 	[0.92338]
805 0.95067 	[0.95108]
808 0.92405 	[0.92383]
825 0.95124 	[0.95148]
828 0.9246 	[0.92427]
833 0.95181 	[0.95178, 0.95124, 0.95247, 0.9528, 0.95211, 0.95176, 0.95215, 0.95142, 0.95287]
834 0.92477 	[0.92362]
835 0.9362 	[0.93736]
836 0.92516 	[0.92457, 0.92722, 0.92628, 0.926, 0.92566, 0.92445, 0.92598]
837 0.94324 	[0.94421]
840 0.9166 	[0.91822]
841 0.94362 	[0.94378]
844 0.91697 	[0.91825]
845 0.95105 	[0.95115]
848 0.92442 	[0.92497]
849 0.95086 	[0.95173]
852 0.92423 	[0.92595]
853 0.94229 	[0.9439]
856 0.91568 	[0.91808]
857 0.94267 	[0.94341]
860 0.91605 	[0.91763]
861 0.9501 	[0.95015, 0.94955, 0.95046, 0.94985, 0.95136]
862 0.9231 	[0.92226]
863 0.9345 	[0.93569]
864 0.92349 	[0.92285, 0.9227, 0.92533]
865 0.94305 	[0.94318]
868 0.91642 	[0.91805]
869 0.95048 	[0.95096, 0.95013, 0.95097, 0.95024, 0.95135]
870 0.92347 	[0.9226]
871 0.93487 	[0.93612]
872 0.92386 	[0.92342, 0.92331, 0.92517]
873 0.95086 	[0.95105]
876 0.92423 	[0.92586]
877 0.94229 	[0.94311]
880 0.91568 	[0.91768]
881 0.94324 	[0.94355, 0.94298]
884 0.9166 	[0.91645]
885 0.95067 	[0.95199]
888 0.92405 	[0.9236, 0.92641]
889 0.95105 	[0.95126, 0.95089, 0.95194, 0.95104, 0.95206]
890 0.92403 	[0.92427, 0.92327]
891 0.93544 	[0.93674]
892 0.92442 	[0.92426, 0.92422, 0.92522]
893 0.94248 	[0.94212]
895 0.94287 	[0.94297]
896 0.91586 	[0.91589]
897 0.95143 	[0.95155]
900 0.92479 	[0.92352]
913 0.95048 	[0.95116]
920 0.91531 	[0.91568]
922 0.93128 	[0.9319]
925 0.94972 	[0.94999]
927 0.93412 	[0.93411]
928 0.92312 	[0.92169]
933 0.9501 	[0.95036]
936 0.92349 	[0.92212]
953 0.95067 	[0.95086]
956 0.92405 	[0.92276]
961 0.95124 	[0.95116]
964 0.9246 	[0.92459]
989 0.94953 	[0.94989]
992 0.92294 	[0.92334]
993 0.94248 	[0.94322]
997 0.94991 	[0.95032]
1000 0.92331 	[0.92388, 0.92401]
1002 0.92328 	[0.92456]
1007 0.94212 	[0.943]
1017 0.95048 	[0.95059]
1020 0.92386 	[0.92422]
1025 0.9998 	[0.99981, 0.99979]
1028 0.97181 	[0.97185]
1032 0.96322 	[0.96318]
1034 0.97961 	[0.97964]
1039 0.98301 	[0.98303]
1053 0.998 	[0.998]
1056 0.97006 	[0.97017]
1061 0.9984 	[0.99837]
1064 0.97045 	[0.97063]
1081 0.999 	[0.999]
1084 0.97103 	[0.97109]
1137 0.991 	[0.99103]
1144 0.97045 	[0.97076]
1146 0.97082 	[0.97067]
1151 0.99022 	[0.99032]
1169 0.9982 	[0.99815]
1176 0.96167 	[0.96161]
1178 0.97804 	[0.97769]
1183 0.98142 	[0.98127]
1249 0.9902 	[0.99013]
1256 0.96967 	[0.96964]
1258 0.97004 	[0.96975]
1263 0.98942 	[0.98961]
1281 0.9838 	[0.98392]
1288 0.96345 	[0.96429]
1290 0.96377 	[0.96347]
1295 0.98308 	[0.98325]
1393 0.99101 	[0.99088]
1400 0.95468 	[0.95503]
1402 0.97099 	[0.97113]
1407 0.97429 	[0.97416]
1409 0.98321 	[0.98296]
1412 0.95545 	[0.95398]
1425 0.98222 	[0.98206]
1432 0.96191 	[0.96242]
1434 0.96222 	[0.96148]
1437 0.98142 	[0.98112]
1439 0.98151 	[0.98164]
1440 0.95372 	[0.95209]
1445 0.98182 	[0.98148]
1448 0.9541 	[0.95255]
1465 0.98241 	[0.98217]
1468 0.95468 	[0.95336]
1505 0.99022 	[0.99009]
1512 0.95391 	[0.95415]
1514 0.97022 	[0.97026]
1519 0.97351 	[0.97342]
1537 0.95181 	[0.95153]
1544 0.93237 	[0.93249]
1546 0.93242 	[0.93138]
1551 0.95137 	[0.9516]
1649 0.95904 	[0.95917]
1656 0.92363 	[0.92448]
1658 0.93967 	[0.93918]
1663 0.94261 	[0.94289]
1681 0.95027 	[0.94976]
1688 0.93088 	[0.93198]
1690 0.93092 	[0.93027]
1695 0.94985 	[0.9503]
1729 0.95104 	[0.95037]
1732 0.92419 	[0.92326]
1757 0.94931 	[0.94896]
1760 0.92251 	[0.92119]
1761 0.95827 	[0.95832]
1765 0.9497 	[0.94939]
1768 0.92288 	[0.92245, 0.92182]
1770 0.93892 	[0.93812]
1775 0.94185 	[0.94178]
1785 0.95027 	[0.9499]
1788 0.92344 	[0.92255]
1793 0.95181 	[0.95238, 0.95156, 0.95179, 0.95135, 0.95203, 0.9521, 0.9519, 0.95136, 0.95258]
1794 0.92477 	[0.92446]
1795 0.9362 	[0.93528]
1796 0.92516 	[0.92545]
1797 0.94324 	[0.94294]
1798 0.93221 	[0.93232]
1799 0.94363 	[0.94354]
1800 0.9166 	[0.91648, 0.91646, 0.91656, 0.91784, 0.9183]
1801 0.94362 	[0.94319]
1802 0.93259 	[0.93246, 0.93224, 0.93296, 0.93349, 0.93417]
1803 0.94401 	[0.94303]
1804 0.91697 	[0.91792]
1805 0.95105 	[0.95141]
1806 0.92403 	[0.92447]
1807 0.93544 	[0.9354, 0.93484, 0.93514, 0.93512, 0.93569]
1808 0.92442 	[0.9247]
1809 0.95086 	[0.951]
1816 0.91568 	[0.91548]
1818 0.93165 	[0.93214]
1823 0.9345 	[0.93388]
1825 0.94305 	[0.94298]
1832 0.92386 	[0.92431]
1834 0.92384 	[0.92486]
1839 0.94269 	[0.94205]
1841 0.94324 	[0.94415]
1848 0.92405 	[0.92538]
1850 0.92403 	[0.92629]
1855 0.94287 	[0.94276]
1857 0.95162 	[0.9515, 0.95114]
1860 0.92497 	[0.92429]
1864 0.91642 	[0.91693]
1866 0.9324 	[0.93311]
1871 0.93525 	[0.93491]
1873 0.95067 	[0.95053]
1880 0.91549 	[0.91576]
1882 0.93147 	[0.93215]
1885 0.94991 	[0.94955]
1887 0.93431 	[0.9336]
1888 0.92331 	[0.92249]
1889 0.94286 	[0.94235]
1893 0.95029 	[0.94992]
1896 0.92368 	[0.92356, 0.92312]
1898 0.92365 	[0.92281]
1903 0.9425 	[0.94212]
1905 0.94305 	[0.94339, 0.94273, 0.9423, 0.9423, 0.94314]
1906 0.93203 	[0.93152]
1907 0.94344 	[0.94238]
1908 0.91642 	[0.91613]
1909 0.95048 	[0.95032]
1910 0.92347 	[0.92295]
1911 0.93487 	[0.93504]
1912 0.92386 	[0.92416]
1913 0.95086 	[0.95001, 0.95076]
1914 0.92384 	[0.92337]
1915 0.93525 	[0.93367]
1916 0.92423 	[0.92525, 0.92403]
1917 0.94229 	[0.9425]
1918 0.93128 	[0.93184]
1919 0.94269 	[0.94333]
1920 0.91568 	[0.91546]
1921 0.95124 	[0.95061]
1928 0.91605 	[0.91689]
1930 0.93203 	[0.93251]
1935 0.93487 	[0.93434]
1937 0.95029 	[0.9507, 0.95027, 0.94995, 0.94922, 0.94962]
1938 0.92328 	[0.92244]
1939 0.93468 	[0.93311]
1940 0.92368 	[0.92291]
1941 0.94172 	[0.94114]
1942 0.93072 	[0.93039]
1943 0.94212 	[0.94191]
1944 0.91513 	[0.9153]
1945 0.9421 	[0.94118]
1946 0.9311 	[0.9307]
1947 0.9425 	[0.94089]
1948 0.91549 	[0.91673]
1949 0.94953 	[0.94918]
1950 0.92254 	[0.92334]
1951 0.93393 	[0.93375]
1952 0.92294 	[0.92233]
1953 0.94248 	[0.94202]
1960 0.92331 	[0.92313]
1962 0.92328 	[0.92369]
1967 0.94212 	[0.94103]
1969 0.94267 	[0.94262]
1976 0.92349 	[0.92362]
1978 0.92347 	[0.92327]
1983 0.94231 	[0.94203]
1985 0.95105 	[0.95136]
1992 0.91586 	[0.9177]
1994 0.93184 	[0.93347]
1999 0.93469 	[0.93516]
2001 0.9501 	[0.95018]
2008 0.91494 	[0.91412]
2010 0.93091 	[0.93001]
2015 0.93374 	[0.93245]
2017 0.94229 	[0.94288, 0.94269, 0.94172, 0.94149, 0.94266]
2018 0.93128 	[0.93103]
2019 0.94269 	[0.94167]
2020 0.91568 	[0.91585]
2021 0.94972 	[0.94985]
2022 0.92272 	[0.92263]
2023 0.93412 	[0.93443]
2024 0.92313 	[0.92311]
2025 0.9501 	[0.94919]
2026 0.9231 	[0.92261]
2027 0.9345 	[0.93292]
2028 0.92349 	[0.92483]
2029 0.94153 	[0.942]
2030 0.93054 	[0.93122]
2031 0.94193 	[0.94241]
2032 0.91494 	[0.91519]
2033 0.94248 	[0.94316]
2040 0.92331 	[0.92479]
2042 0.92328 	[0.92505]
2047 0.94212 	[0.94181]
2049 0.9994 	[0.99943, 0.99938]
2052 0.97142 	[0.97184]
2056 0.96283 	[0.96281]
2058 0.97922 	[0.97906]
2063 0.98261 	[0.98284]
2077 0.9976 	[0.99757]
2080 0.96967 	[0.97049]
2085 0.998 	[0.99798]
2088 0.97006 	[0.97073]
2105 0.9986 	[0.99858]
2108 0.97064 	[0.9711]
2161 0.9906 	[0.99071]
2168 0.97006 	[0.97017]
2170 0.97043 	[0.97057]
2175 0.98982 	[0.98989]
2193 0.9978 	[0.99772]
2200 0.96128 	[0.96122]
2202 0.97765 	[0.97763]
2207 0.98103 	[0.98104]
2273 0.9898 	[0.98994]
2280 0.96928 	[0.96948]
2282 0.96965 	[0.96948]
2287 0.98903 	[0.98914]
2305 0.9834 	[0.98339, 0.98366, 0.98343, 0.98324, 0.98356, 0.98321, 0.98309, 0.98378, 0.98364]
2306 0.97158 	[0.9719]
2307 0.98347 	[0.98351]
2308 0.95565 	[0.95684]
2309 0.99081 	[0.9906]
2310 0.96299 	[0.96299]
2311 0.97488 	[0.97548]
2312 0.96307 	[0.96325, 0.96355, 0.9629, 0.96307, 0.96264]
2313 0.99121 	[0.99128]
2314 0.96338 	[0.96376, 0.96409, 0.96233, 0.9639, 0.96396]
2315 0.97528 	[0.97523]
2316 0.96345 	[0.96479]
2317 0.98261 	[0.98285]
2318 0.9708 	[0.97093]
2319 0.98269 	[0.9828, 0.98256, 0.98264, 0.98297, 0.98268]
2320 0.95487 	[0.95619]
2321 0.98241 	[0.98209]
2328 0.9621 	[0.96197]
2330 0.96241 	[0.96132]
2335 0.9817 	[0.98188]
2337 0.99061 	[0.99079]
2344 0.9543 	[0.9554]
2346 0.97061 	[0.97088]
2351 0.9739 	[0.9742]
2353 0.99081 	[0.9909]
2360 0.95449 	[0.95424]
2362 0.9708 	[0.97116]
2367 0.9741 	[0.97387]
2369 0.98321 	[0.98292]
2376 0.96287 	[0.96273]
2378 0.96319 	[0.96212]
2383 0.98249 	[0.98251]
2385 0.98222 	[0.98198]
2392 0.96191 	[0.96177]
2394 0.96222 	[0.96108]
2399 0.98151 	[0.98182]
2401 0.99041 	[0.99034]
2408 0.9541 	[0.95492]
2410 0.97041 	[0.97084]
2415 0.9737 	[0.9737]
2417 0.99061 	[0.99064, 0.99078, 0.99049, 0.99059, 0.99054]
2418 0.9628 	[0.96333]
2419 0.97469 	[0.97434]
2420 0.96287 	[0.96322]
2421 0.98202 	[0.98196]
2422 0.97022 	[0.97052]
2423 0.9821 	[0.98276]
2424 0.9543 	[0.95555]
2425 0.98241 	[0.98215]
2426 0.97061 	[0.97074]
2427 0.98249 	[0.98255]
2428 0.95468 	[0.95596]
2429 0.98982 	[0.98981]
2430 0.96202 	[0.96254]
2431 0.9739 	[0.97363]
2432 0.9621 	[0.96279]
2433 0.98281 	[0.98317, 0.98257]
2436 0.95507 	[0.95549]
2440 0.96249 	[0.96218]
2442 0.9628 	[0.96306]
2447 0.9821 	[0.98227]
2449 0.98182 	[0.98173, 0.98214, 0.98201, 0.98149, 0.98193]
2450 0.97002 	[0.97049]
2451 0.9819 	[0.98197]
2452 0.9541 	[0.95531]
2453 0.98923 	[0.98912]
2454 0.96144 	[0.9618]
2455 0.97331 	[0.97409]
2456 0.96153 	[0.9616]
2457 0.98962 	[0.98958]
2458 0.96183 	[0.96159]
2459 0.9737 	[0.97363]
2460 0.96191 	[0.96384]
2461 0.98103 	[0.98125, 0.9809]
2462 0.96925 	[0.96961]
2463 0.98112 	[0.98115]
2464 0.95333 	[0.95474, 0.95415]
2465 0.99002 	[0.99015]
2469 0.98142 	[0.98122]
2472 0.95372 	[0.95451, 0.95459]
2474 0.97002 	[0.9698]
2479 0.97331 	[0.97335]
2481 0.99022 	[0.98989]
2488 0.95391 	[0.95494]
2489 0.98202 	[0.98187]
2490 0.97022 	[0.97082]
2492 0.9543 	[0.95488]
2495 0.97351 	[0.97329]
2497 0.98261 	[0.98281]
2504 0.9623 	[0.9618]
2506 0.96261 	[0.96332]
2511 0.9819 	[0.98171]
2513 0.98162 	[0.98141]
2520 0.96133 	[0.96158]
2522 0.96163 	[0.96225]
2527 0.98092 	[0.98075]
2529 0.98982 	[0.98988, 0.98995, 0.98975, 0.9898, 0.98975]
2530 0.96202 	[0.96235]
2531 0.9739 	[0.97386]
2532 0.9621 	[0.9629]
2533 0.98123 	[0.98121]
2534 0.96944 	[0.96963]
2535 0.98131 	[0.98193]
2536 0.95353 	[0.95451]
2537 0.98162 	[0.98138]
2538 0.96983 	[0.96987]
2539 0.9817 	[0.98195]
2540 0.95391 	[0.95514]
2541 0.98903 	[0.98907]
2542 0.96125 	[0.96167]
2543 0.97312 	[0.97294]
2544 0.96133 	[0.96237]
2545 0.99002 	[0.99006]
2552 0.95372 	[0.95373]
2554 0.97002 	[0.97015]
2559 0.97331 	[0.97296]
2561 0.95142 	[0.95128]
2568 0.932 	[0.93179]
2570 0.93205 	[0.93218]
2575 0.95099 	[0.9505]
2673 0.95866 	[0.95806]
2680 0.92326 	[0.9237]
2682 0.9393 	[0.93844]
2687 0.94223 	[0.94256]
2705 0.94989 	[0.9496]
2712 0.93051 	[0.93019]
2714 0.93054 	[0.93041]
2719 0.94947 	[0.94886]
2753 0.95066 	[0.95175]
2756 0.92381 	[0.92554]
2781 0.94893 	[0.95066]
2784 0.92214 	[0.9249]
2785 0.95789 	[0.95747]
2789 0.94931 	[0.95097]
2792 0.92251 	[0.92287, 0.92521]
2794 0.93854 	[0.93755]
2799 0.94147 	[0.94172]
2809 0.94989 	[0.95116]
2812 0.92307 	[0.92498]
2817 0.95143 	[0.95153]
2824 0.91623 	[0.91568]
2826 0.93221 	[0.93108]
2831 0.93506 	[0.93468]
2881 0.95124 	[0.95232]
2884 0.9246 	[0.92575]
2909 0.94953 	[0.95077]
2912 0.92294 	[0.92516]
2917 0.94991 	[0.95076]
2920 0.92331 	[0.92505]
2929 0.94267 	[0.9423]
2936 0.92349 	[0.92419]
2937 0.95048 	[0.95148]
2938 0.92347 	[0.92289]
2940 0.92386 	[0.92504]
2943 0.94231 	[0.94234]
2961 0.94991 	[0.95035]
2968 0.91476 	[0.91538]
2970 0.93072 	[0.93034]
2975 0.93355 	[0.93381]
3041 0.94191 	[0.94125]
3048 0.92276 	[0.92271]
3050 0.92272 	[0.9217]
3055 0.94155 	[0.94166]
3073 0.9992 	[0.99922, 0.99919, 0.99923, 0.99924, 0.99919, 0.99918, 0.99918, 0.99924, 0.99918, 0.99926, 0.9992, 0.99917]
3074 0.97122 	[0.97113]
3075 0.98321 	[0.98355]
3076 0.97122 	[0.97105, 0.97112, 0.97148, 0.97114, 0.97145, 0.9714, 0.97128, 0.9709, 0.9712]
3077 0.9906 	[0.99054]
3080 0.96264 	[0.96244, 0.96334]
3081 0.991 	[0.99093]
3082 0.97902 	[0.97929]
3084 0.96303 	[0.96346]
3085 0.9984 	[0.99841]
3087 0.98241 	[0.98264]
3088 0.97045 	[0.97077]
3089 0.9982 	[0.99817]
3092 0.97025 	[0.9707]
3093 0.9896 	[0.98956]
3096 0.96167 	[0.96257]
3097 0.99 	[0.99001]
3100 0.96206 	[0.9619]
3101 0.9974 	[0.99738, 0.99742, 0.99736, 0.9974, 0.99738, 0.99741, 0.99737]
3102 0.96945 	[0.96936]
3103 0.98142 	[0.98177]
3104 0.96947 	[0.96932, 0.96952, 0.96949, 0.96893, 0.96962]
3105 0.9904 	[0.99036]
3108 0.96245 	[0.96308]
3109 0.9978 	[0.99778, 0.99786, 0.99772, 0.99782, 0.99785, 0.99781, 0.99777]
3110 0.96985 	[0.96968]
3111 0.98182 	[0.98209]
3112 0.96986 	[0.96972, 0.97003, 0.96989, 0.96954, 0.96994]
3113 0.9982 	[0.99823]
3116 0.97025 	[0.97031]
3117 0.9896 	[0.98963]
3120 0.96167 	[0.96191]
3121 0.9906 	[0.99059]
3124 0.96264 	[0.9629]
3125 0.998 	[0.99803]
3128 0.97006 	[0.97011]
3129 0.9984 	[0.99839, 0.9984, 0.99838, 0.99842, 0.9984, 0.99843, 0.99841]
3130 0.97043 	[0.97035]
3131 0.98241 	[0.98284]
3132 0.97045 	[0.9705, 0.97054, 0.97035, 0.97005, 0.97053]
3133 0.9898 	[0.98976]
3136 0.96186 	[0.96248]
3137 0.999 	[0.99901]
3140 0.97103 	[0.9709]
3165 0.9972 	[0.99713]
3168 0.96928 	[0.96922]
3173 0.9976 	[0.99758]
3176 0.96967 	[0.96966]
3185 0.9904 	[0.99059]
3192 0.96986 	[0.96985]
3193 0.9982 	[0.99816]
3194 0.97024 	[0.971]
3196 0.97025 	[0.97004]
3199 0.98962 	[0.98978]
3201 0.9986 	[0.99859]
3204 0.97064 	[0.96999]
3217 0.9976 	[0.99761]
3224 0.96109 	[0.96122]
3226 0.97745 	[0.97779]
3229 0.9968 	[0.99686]
3231 0.98083 	[0.98131]
3232 0.96889 	[0.96795]
3237 0.9972 	[0.99726]
3240 0.96928 	[0.9687]
3257 0.9978 	[0.99785]
3260 0.96986 	[0.96915]
3265 0.9984 	[0.99839]
3268 0.97045 	[0.97058]
3293 0.9966 	[0.99655]
3296 0.9687 	[0.96895]
3297 0.9896 	[0.98974]
3301 0.997 	[0.99696]
3304 0.96909 	[0.96902, 0.96919]
3306 0.96945 	[0.96998]
3311 0.98883 	[0.98882]
3321 0.9976 	[0.99759]
3324 0.96967 	[0.96994]
3329 0.98321 	[0.98341, 0.98362]
3332 0.95545 	[0.95567]
3336 0.96287 	[0.96342]
3338 0.96319 	[0.96341]
3343 0.98249 	[0.98281]
3357 0.98142 	[0.98183]
3360 0.95372 	[0.95389]
3365 0.98182 	[0.98219]
3368 0.9541 	[0.95385]
3385 0.98241 	[0.98272]
3388 0.95468 	[0.95465]
3393 0.98301 	[0.98325]
3396 0.95526 	[0.95515]
3421 0.98123 	[0.98142]
3424 0.95353 	[0.95324]
3429 0.98162 	[0.98178]
3432 0.95391 	[0.95333]
3441 0.99041 	[0.99088]
3448 0.9541 	[0.9553]
3449 0.98222 	[0.98235]
3450 0.97041 	[0.97053]
3452 0.95449 	[0.95413]
3455 0.9737 	[0.97418]
3457 0.98261 	[0.98223, 0.98259, 0.98257, 0.9825, 0.98267, 0.98254, 0.98272]
3458 0.9708 	[0.97045]
3459 0.98269 	[0.98245]
3460 0.95487 	[0.95485, 0.95517, 0.95556, 0.9564, 0.95468]
3461 0.99002 	[0.99013]
3464 0.9623 	[0.96319]
3465 0.99041 	[0.99037]
3468 0.96268 	[0.96406]
3469 0.98182 	[0.98192]
3472 0.9541 	[0.95429]
3473 0.98162 	[0.98196, 0.9818]
3476 0.95391 	[0.95495]
3477 0.98903 	[0.98911]
3480 0.96133 	[0.96198, 0.96247]
3481 0.98942 	[0.98958]
3482 0.96163 	[0.96195]
3484 0.96172 	[0.96217]
3485 0.98083 	[0.98034, 0.98076, 0.98089]
3486 0.96905 	[0.96858]
3487 0.98092 	[0.98138, 0.9807]
3488 0.95314 	[0.95332]
3489 0.98982 	[0.98977]
3492 0.9621 	[0.96372]
3493 0.98123 	[0.98083, 0.98124, 0.98124]
3494 0.96944 	[0.96909]
3495 0.98131 	[0.98107]
3496 0.95353 	[0.9537]
3497 0.98162 	[0.98167]
3500 0.95391 	[0.95599]
3501 0.98903 	[0.98918]
3504 0.96133 	[0.962]
3505 0.99002 	[0.99023]
3508 0.9623 	[0.96211]
3509 0.98142 	[0.98133]
3512 0.95372 	[0.95416]
3513 0.98182 	[0.98151, 0.98176, 0.98192]
3514 0.97002 	[0.96976]
3515 0.9819 	[0.98166]
3516 0.9541 	[0.95419]
3517 0.98923 	[0.98937]
3520 0.96153 	[0.9616]
3521 0.98241 	[0.98285]
3524 0.95468 	[0.95489]
3549 0.98063 	[0.98093]
3552 0.95295 	[0.95267]
3553 0.98962 	[0.98999]
3557 0.98103 	[0.98139]
3560 0.95333 	[0.95393, 0.95296]
3562 0.96963 	[0.96981]
3567 0.97292 	[0.97362]
3577 0.98162 	[0.98205]
3580 0.95391 	[0.9538]
3585 0.95123 	[0.95148, 0.95154, 0.95103, 0.95167, 0.95158, 0.95104, 0.95147, 0.95099, 0.95201, 0.95059]
3586 0.94005 	[0.94102]
3587 0.95156 	[0.95162]
3588 0.92437 	[0.92421, 0.92235]
3589 0.95866 	[0.95905]
3590 0.93148 	[0.93216]
3591 0.94299 	[0.94381]
3592 0.93181 	[0.93264, 0.93092, 0.9328, 0.93195, 0.93312]
3593 0.95904 	[0.95941]
3594 0.93186 	[0.93225, 0.9307, 0.93205, 0.93236, 0.9337]
3595 0.94337 	[0.94345]
3596 0.93219 	[0.9332]
3597 0.95046 	[0.95082]
3598 0.9393 	[0.93948]
3599 0.9508 	[0.95153, 0.9505, 0.95127, 0.95043, 0.95112]
3600 0.92363 	[0.92336]
3601 0.95027 	[0.9506]
3608 0.93088 	[0.93246]
3610 0.93092 	[0.93133]
3613 0.94951 	[0.94862]
3615 0.94985 	[0.95029]
3616 0.9227 	[0.9205]
3617 0.95846 	[0.95802]
3621 0.94989 	[0.94856]
3624 0.92307 	[0.9233, 0.92073]
3626 0.93911 	[0.93874]
3631 0.94204 	[0.94163]
3633 0.95866 	[0.95912]
3640 0.92325 	[0.92468]
3641 0.95046 	[0.94951]
3642 0.9393 	[0.94051]
3644 0.92363 	[0.9215]
3647 0.94223 	[0.94247]
3649 0.95104 	[0.95115, 0.9515]
3652 0.92419 	[0.92404]
3656 0.93163 	[0.93291]
3658 0.93167 	[0.93232]
3663 0.95061 	[0.95112]
3665 0.95008 	[0.95025]
3672 0.9307 	[0.93246]
3674 0.93073 	[0.93159]
3677 0.94931 	[0.94961]
3679 0.94966 	[0.95008]
3680 0.92251 	[0.92203]
3681 0.95827 	[0.95843]
3685 0.9497 	[0.94998]
3688 0.92288 	[0.92209, 0.92246]
3690 0.93892 	[0.93801]
3695 0.94185 	[0.9408]
3697 0.95846 	[0.95861, 0.95866, 0.95819, 0.95868, 0.95778]
3698 0.9313 	[0.93153]
3699 0.9428 	[0.94284]
3700 0.93163 	[0.93041]
3701 0.94989 	[0.94942]
3702 0.93873 	[0.93963]
3703 0.95023 	[0.95076]
3704 0.92307 	[0.92418]
3705 0.95027 	[0.95084, 0.95068]
3706 0.93911 	[0.93923]
3707 0.95061 	[0.95025]
3708 0.92344 	[0.92447, 0.92303]
3709 0.9577 	[0.95712]
3710 0.93054 	[0.93136]
3711 0.94204 	[0.94259]
3712 0.93088 	[0.92969]
3713 0.95066 	[0.95036, 0.94986]
3716 0.92381 	[0.9219]
3720 0.93125 	[0.9314]
3722 0.9313 	[0.9316]
3727 0.95023 	[0.94969]
3729 0.9497 	[0.94999, 0.95036, 0.94905, 0.9499, 0.94947]
3730 0.93854 	[0.93912]
3731 0.95004 	[0.94955]
3732 0.92288 	[0.92197]
3733 0.95712 	[0.95731]
3734 0.92998 	[0.9303]
3735 0.94147 	[0.94225]
3736 0.93032 	[0.93107]
3737 0.95751 	[0.95747]
3738 0.93036 	[0.93043]
3739 0.94185 	[0.9417]
3740 0.9307 	[0.93185]
3741 0.94893 	[0.94881, 0.94804]
3742 0.93779 	[0.93844]
3743 0.94928 	[0.95003]
3744 0.92214 	[0.9213, 0.92018]
3745 0.95789 	[0.95722]
3749 0.94931 	[0.94786]
3752 0.92251 	[0.92265, 0.92034]
3754 0.93854 	[0.93761]
3759 0.94147 	[0.94083]
3761 0.95808 	[0.95825]
3768 0.9227 	[0.92199]
3769 0.94989 	[0.94885]
3770 0.93873 	[0.93805]
3772 0.92307 	[0.92103]
3775 0.94166 	[0.94085]
3777 0.95046 	[0.95106, 0.95019, 0.95063, 0.95079, 0.95102, 0.9513, 0.95026, 0.95076]
3778 0.9393 	[0.93904]
3779 0.9508 	[0.9511]
3780 0.92363 	[0.92298, 0.92503, 0.92414, 0.92375, 0.92395]
3781 0.95789 	[0.95881]
3784 0.93107 	[0.93288, 0.93192]
3785 0.95827 	[0.95791]
3786 0.93111 	[0.9328]
3788 0.93144 	[0.93111]
3789 0.9497 	[0.94994]
3791 0.95004 	[0.95002]
3792 0.92288 	[0.92334]
3793 0.94951 	[0.94881, 0.95078]
3796 0.9227 	[0.92378]
3797 0.95693 	[0.95816]
3800 0.93014 	[0.92922, 0.93121]
3801 0.95731 	[0.95795]
3802 0.93017 	[0.92872]
3804 0.93051 	[0.93204]
3805 0.94874 	[0.94885, 0.94871, 0.94919]
3806 0.93761 	[0.9369]
3807 0.94909 	[0.94873, 0.94936]
3808 0.92195 	[0.92138]
3809 0.9577 	[0.95791, 0.95816, 0.95778, 0.95781, 0.95766, 0.9573]
3810 0.93054 	[0.93104]
3811 0.94204 	[0.94193]
3812 0.93088 	[0.93056, 0.931]
3813 0.94912 	[0.9493, 0.94971, 0.94924, 0.94972]
3814 0.93798 	[0.93955, 0.93746]
3815 0.94947 	[0.95, 0.95004]
3816 0.92232 	[0.92291, 0.9219]
3817 0.94951 	[0.94962, 0.9495]
3818 0.93836 	[0.93853]
3819 0.94985 	[0.94959]
3820 0.9227 	[0.92437, 0.9234]
3821 0.95693 	[0.95713, 0.95752]
3822 0.92979 	[0.93113]
3823 0.94128 	[0.9418]
3824 0.93014 	[0.92989, 0.93195]
3825 0.95789 	[0.95791, 0.95812]
3828 0.93107 	[0.93142]
3829 0.94931 	[0.94988]
3832 0.92251 	[0.92411, 0.92389]
3833 0.9497 	[0.94993, 0.95018, 0.95017]
3834 0.93854 	[0.93945, 0.93861]
3835 0.95004 	[0.95079]
3836 0.92288 	[0.92286]
3837 0.95712 	[0.95735]
3839 0.94147 	[0.94162]
3840 0.93032 	[0.93085]
3841 0.95124 	[0.95228, 0.95137]
3844 0.9246 	[0.92402]
3848 0.91605 	[0.91584]
3850 0.93203 	[0.93288]
3855 0.93487 	[0.9351]
3869 0.94953 	[0.94987]
3872 0.92294 	[0.92279]
3877 0.94991 	[0.95037]
3880 0.92331 	[0.92315]
3897 0.95048 	[0.95084]
3900 0.92386 	[0.92364]
3905 0.95105 	[0.95086, 0.95068, 0.95182, 0.95203, 0.95155, 0.95093, 0.95128]
3906 0.92403 	[0.92317]
3907 0.93544 	[0.93669]
3908 0.92442 	[0.92359, 0.92645, 0.92547, 0.92545, 0.92466]
3909 0.94248 	[0.94348]
3912 0.91586 	[0.91725]
3913 0.94286 	[0.94295]
3916 0.91623 	[0.91769]
3917 0.95029 	[0.95029]
3920 0.92368 	[0.924]
3921 0.9501 	[0.95119]
3924 0.92349 	[0.92516]
3925 0.94153 	[0.94325]
3928 0.91494 	[0.91712]
3929 0.94191 	[0.94259]
3932 0.91531 	[0.91701]
3933 0.94934 	[0.9493, 0.94906, 0.94978]
3934 0.92235 	[0.92189]
3935 0.93374 	[0.93505]
3936 0.92276 	[0.92191]
3937 0.94229 	[0.9424]
3940 0.91568 	[0.91753]
3941 0.94972 	[0.95013, 0.94966, 0.95028]
3942 0.92272 	[0.92221]
3943 0.93412 	[0.93549]
3944 0.92313 	[0.92249]
3945 0.9501 	[0.95019]
3948 0.92349 	[0.92539]
3949 0.94153 	[0.94236]
3952 0.91494 	[0.91712]
3953 0.94248 	[0.94266, 0.94211]
3956 0.91586 	[0.91541]
3957 0.94991 	[0.95116]
3960 0.92331 	[0.92275, 0.92566]
3961 0.95029 	[0.95035, 0.95035, 0.95131]
3962 0.92328 	[0.92347, 0.92282]
3963 0.93469 	[0.93607]
3964 0.92368 	[0.92336]
3965 0.94172 	[0.94125]
3967 0.94212 	[0.94212]
3968 0.91513 	[0.91485]
3969 0.95067 	[0.95066]
3972 0.92405 	[0.92267]
3985 0.94972 	[0.95038]
3992 0.91457 	[0.91486]
3994 0.93054 	[0.93119]
3997 0.94896 	[0.94918]
3999 0.93337 	[0.93337]
4000 0.92239 	[0.92086]
4005 0.94934 	[0.94955]
4008 0.92276 	[0.92131]
4025 0.94991 	[0.94997]
4028 0.92331 	[0.92194]
4033 0.95048 	[0.95048]
4036 0.92386 	[0.92393]
4061 0.94877 	[0.94914]
4064 0.9222 	[0.92265]
4065 0.94172 	[0.94233]
4069 0.94915 	[0.94962]
4072 0.92257 	[0.92296, 0.92321]
4074 0.92254 	[0.92375]
4079 0.94137 	[0.94211]
4089 0.94972 	[0.94995]
4092 0.92313 	[0.92349]
In [54]:
using Statistics
# Get an idea of the variance in the oracle (where we have repeated fidelity estimates)
# This is based on 1,000 shots per 'experiment sequence'
for (ix,i) in enumerate(estimateOracle)
    if length(i) > 1 
        print("$ix: $(var(i))\n")
    end
end
1: 0.0
2: 2.4015768904846784e-7
3: 1.50785789761816e-7
4: 5.5323714558751604e-8
5: 1.067059598645194e-8
8: 2.8531740381809753e-7
9: 2.178518349875181e-12
10: 3.6463511391561535e-8
12: 8.978581756069673e-9
13: 2.225255542883944e-9
15: 2.1334208117480867e-8
16: 6.926519921658443e-13
17: 1.5312982745800985e-9
24: 6.749229610940465e-7
29: 1.2017293611320204e-9
31: 1.707874061484662e-7
32: 1.7969543035492927e-7
33: 1.104222462654191e-9
37: 1.4479145023763648e-9
40: 1.984939430390262e-7
49: 1.6723242151334215e-9
56: 2.5639669134587785e-8
57: 1.0830137763592448e-9
58: 4.5784731421970874e-7
60: 8.16176005244911e-8
65: 2.6193734717542385e-9
104: 3.2508179925329365e-10
113: 2.088257528286289e-8
120: 9.614037103990904e-8
121: 2.6409037014116985e-9
122: 1.603564240416226e-7
124: 1.4375850654322247e-7
127: 2.708235179784811e-9
129: 4.537805434562285e-11
145: 7.090932120682458e-9
152: 2.7684564203310423e-7
154: 4.446460426170507e-8
157: 6.808220002016595e-9
159: 6.338869342992565e-8
160: 1.0377913238822297e-6
168: 1.053571970697729e-7
193: 6.385078601262505e-10
225: 2.0420629355926066e-8
229: 1.29750825040717e-9
232: 5.8631577914400964e-8
234: 1.620010835872509e-7
239: 5.992564289762187e-9
257: 4.2664148076160086e-8
385: 9.330785330151773e-9
388: 8.093561334264893e-7
401: 1.2327582034808143e-8
408: 1.3775346123569166e-7
413: 1.7159497368997825e-8
415: 9.69019431535846e-8
416: 1.6891119253041878e-6
421: 9.941836291993947e-9
424: 1.6829423975779585e-6
441: 1.0868289754517585e-8
444: 1.0382655030055923e-6
488: 1.8559240997940965e-7
513: 6.408912777328474e-8
705: 2.922239960522234e-7
708: 8.641818862476931e-7
733: 6.803089511462973e-7
736: 3.7090480128720713e-6
737: 2.1830126153255827e-8
741: 5.617714951240028e-7
744: 2.0538835547566247e-6
761: 2.918328786678761e-7
764: 1.4721605883543292e-6
769: 6.073159664694409e-7
833: 3.28943748507194e-7
836: 9.400011863840682e-7
861: 4.823371021199301e-7
864: 2.177962934752857e-6
869: 2.730425286072919e-7
872: 1.0820167043890086e-6
881: 1.6383379811744987e-7
888: 3.954249261521601e-6
889: 2.795444133184951e-7
890: 4.970629739022461e-7
892: 3.1970290341035076e-7
1000: 8.306363002285679e-9
1025: 2.3873333841197416e-10
1768: 1.9931300494325615e-7
1793: 1.8410736567586368e-7
1800: 7.66154639475543e-7
1802: 6.188337397599058e-7
1807: 1.0306511447215306e-7
1857: 6.714753517726496e-8
1896: 9.694711692774297e-8
1905: 2.420756496712187e-7
1913: 2.863919891504028e-7
1916: 7.440184193767592e-7
1937: 3.2789342654248177e-7
2017: 4.0223842218355667e-7
2049: 1.3078797438694954e-9
2305: 5.4670779945779684e-8
2312: 1.1822062634335761e-7
2314: 5.267996550510961e-7
2319: 2.5713191119883205e-8
2417: 1.288276161223577e-8
2433: 1.8490205581463577e-7
2449: 6.54066058081546e-8
2461: 6.133449024574238e-8
2464: 1.733303941173275e-7
2472: 3.4357097001563734e-9
2529: 7.096928855038355e-9
2792: 2.7502707036739298e-6
3073: 8.723738208658598e-10
3076: 3.8355519333863964e-8
3080: 4.046492125637285e-7
3101: 4.48153615520223e-10
3104: 7.515189020295687e-8
3109: 2.3351707709582795e-9
3112: 3.8060291542554694e-8
3129: 2.780038721462716e-10
3132: 4.37952840295822e-8
3304: 1.497099267723311e-8
3329: 2.2179417299015104e-8
3457: 2.458976713148626e-8
3460: 4.7003710011808296e-7
3473: 1.227166523407743e-8
3480: 1.1884852181649184e-7
3485: 8.33161475432735e-8
3487: 2.2853536877679304e-7
3493: 5.5624271403751925e-8
3513: 4.277733871629999e-8
3560: 4.732738521807581e-7
3585: 1.7286320075454504e-7
3588: 1.7227208847230142e-6
3592: 7.641249503764565e-7
3594: 1.1409508023945698e-6
3599: 2.3653396514852813e-7
3624: 3.3189234193274346e-6
3649: 6.38890928970502e-8
3688: 6.579179851124789e-8
3697: 1.5494455279657186e-7
3705: 1.3672483987419606e-8
3708: 1.0392572166708177e-6
3713: 1.269578984479865e-7
3729: 2.572206123381697e-7
3741: 2.921706243673055e-7
3744: 6.280704050813674e-7
3752: 2.6677781876079027e-6
3777: 1.4775148266121797e-7
3780: 5.445666410423697e-7
3784: 4.643694515894267e-7
3793: 1.939967938361251e-6
3800: 1.9722413112912866e-6
3805: 6.095236110019247e-8
3807: 2.0312418117816616e-7
3809: 7.981508547007415e-8
3812: 9.932768029113756e-8
3813: 6.726989098052332e-8
3814: 2.1850046833112867e-6
3815: 1.2121559016753712e-9
3816: 5.130463314441521e-7
3817: 6.518124781276381e-9
3820: 4.7126618875536286e-7
3821: 7.668441123901582e-8
3824: 2.129909051743053e-6
3825: 2.2417024928082872e-8
3832: 2.4408009838343956e-8
3833: 2.016529939682111e-8
3834: 3.4590110716898126e-7
3841: 4.103363701392373e-7
3905: 2.620348419302889e-7
3908: 1.1362440348252749e-6
3933: 1.3572608238795897e-7
3941: 1.0144306995629207e-7
3953: 1.505988073883599e-7
3960: 4.242270907107228e-6
3961: 3.026229944607843e-7
3962: 2.1207015172951415e-7
4072: 3.198161859483364e-8
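As a rough sanity check (my own back-of-the-envelope, not part of the workbook's protocol): a raw outcome probability estimated from 1,000 shots carries binomial shot-noise variance $p(1-p)/N$, which for $p \approx 0.95$ is around $5\times10^{-5}$. The variances printed above are one to two orders of magnitude smaller, reflecting the averaging the fitting procedure does across sequence lengths.

```julia
# Hypothetical sketch: binomial shot-noise variance of a single raw
# probability estimate from N shots. Compare against the fitted
# eigenvalue variances printed above (mostly 1e-9 to 1e-6).
shot_variance(p, N) = p * (1 - p) / N

println(shot_variance(0.95, 1000))   # 4.75e-5
```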
In [55]:
# Collapse the oracle to a single figure per Pauli by averaging any repeated estimates
# (0 marks Paulis for which we have no estimate)
oracleToUse = [length(x) > 0 ? mean(x) : 0 for x in estimateOracle]
Out[55]:
4096-element Array{Real,1}:
 1.0
 0.9720988936902138
 0.9840623974961618
 0.9720291467193548
 0.9913625817827061
 0.9795678213424949
 0.99153629425895
 0.9636106976757578
 0.9917309645128152
 0.9798848056702893
 0.9917306043528243
 0.9641518473962647
 0.9992037586930189
 ⋮
 0
 0
 0
 0
 0.9499529053266064
 0
 0
 0.9234866856425643
 0
 0
 0
 0
In [56]:
# Just out of interest we can now work out the error in the oracle using this number of shots
# and our fitting algorithm:
print("Actual error in the eigenvalues estimated\n")
avE =[]
for (ix,i) in enumerate(oracleToUse)
    if i !=0
        percentError = round((i-actualOracle[ix])/actualOracle[ix]*100,digits=6)
        push!(avE,percentError)
        print("$(string(ix,pad=4)): $(fidelityLabels(ix-1,qubits=6)): $(round(percentError,digits=3))%\n")
    end
end
print("Mean error = $(round(mean(avE),digits=4))%, std = $(round(std(avE),digits=5))(%)\n")
Actual error in the eigenvalues estimated
0001: IIIIII: 0.0%
0002: IIIIIX: 0.01%
0003: IIIIIY: 0.006%
0004: IIIIIZ: 0.003%
0005: IIIIXI: -0.004%
0006: IIIIXX: 0.016%
0007: IIIIXY: 0.013%
0008: IIIIXZ: 0.02%
0009: IIIIYI: -0.007%
0010: IIIIYX: 0.008%
0011: IIIIYY: -0.007%
0012: IIIIYZ: 0.036%
0013: IIIIZI: 0.0%
0014: IIIIZX: -0.023%
0015: IIIIZY: 0.002%
0016: IIIIZZ: 0.043%
0017: IIIXII: 0.002%
0020: IIIXIZ: 0.041%
0021: IIIXXI: -0.011%
0024: IIIXXZ: 0.025%
0025: IIIXYI: 0.003%
0026: IIIXYX: 0.02%
0028: IIIXYZ: 0.011%
0029: IIIXZI: -0.002%
0030: IIIXZX: -0.032%
0031: IIIXZY: 0.006%
0032: IIIXZZ: 0.001%
0033: IIIYII: -0.001%
0036: IIIYIZ: 0.045%
0037: IIIYXI: 0.0%
0038: IIIYXX: -0.038%
0039: IIIYXY: 0.029%
0040: IIIYXZ: -0.006%
0041: IIIYYI: 0.002%
0042: IIIYYX: -0.024%
0044: IIIYYZ: 0.005%
0045: IIIYZI: 0.004%
0047: IIIYZY: -0.007%
0048: IIIYZZ: 0.046%
0049: IIIZII: -0.0%
0052: IIIZIZ: 0.041%
0053: IIIZXI: 0.005%
0056: IIIZXZ: -0.0%
0057: IIIZYI: -0.0%
0058: IIIZYX: 0.019%
0059: IIIZYY: 0.04%
0060: IIIZYZ: -0.003%
0061: IIIZZI: -0.007%
0063: IIIZZY: -0.001%
0064: IIIZZZ: 0.078%
0065: IIXIII: -0.001%
0068: IIXIIZ: -0.009%
0072: IIXIXZ: -0.068%
0074: IIXIYX: 0.006%
0079: IIXIZY: -0.008%
0081: IIXXII: 0.009%
0088: IIXXXZ: -0.052%
0090: IIXXYX: 0.004%
0093: IIXXZI: -0.014%
0095: IIXXZY: -0.012%
0096: IIXXZZ: -0.006%
0097: IIXYII: -0.009%
0101: IIXYXI: -0.007%
0104: IIXYXZ: 0.0%
0106: IIXYYX: 0.017%
0111: IIXYZY: -0.004%
0113: IIXZII: 0.006%
0114: IIXZIX: 0.024%
0115: IIXZIY: -0.022%
0116: IIXZIZ: 0.074%
0117: IIXZXI: 0.001%
0118: IIXZXX: 0.054%
0119: IIXZXY: 0.011%
0120: IIXZXZ: 0.034%
0121: IIXZYI: -0.006%
0122: IIXZYX: 0.027%
0123: IIXZYY: -0.029%
0124: IIXZYZ: 0.003%
0125: IIXZZI: 0.004%
0126: IIXZZX: -0.027%
0127: IIXZZY: 0.007%
0128: IIXZZZ: 0.076%
0129: IIYIII: 0.002%
0132: IIYIIZ: -0.085%
0136: IIYIXZ: -0.039%
0138: IIYIYX: -0.063%
0143: IIYIZY: 0.016%
0145: IIYXII: -0.001%
0146: IIYXIX: 0.06%
0147: IIYXIY: -0.051%
0148: IIYXIZ: 0.01%
0149: IIYXXI: 0.019%
0150: IIYXXX: 0.026%
0151: IIYXXY: 0.011%
0152: IIYXXZ: 0.031%
0153: IIYXYI: -0.02%
0154: IIYXYX: -0.003%
0155: IIYXYY: -0.028%
0156: IIYXYZ: 0.049%
0157: IIYXZI: 0.003%
0158: IIYXZX: -0.011%
0159: IIYXZY: 0.014%
0160: IIYXZZ: -0.038%
0161: IIYYII: 0.017%
0165: IIYYXI: 0.008%
0168: IIYYXZ: -0.102%
0170: IIYYYX: -0.047%
0175: IIYYZY: -0.011%
0177: IIYZII: -0.023%
0184: IIYZXZ: 0.007%
0185: IIYZYI: 0.006%
0186: IIYZYX: 0.02%
0188: IIYZYZ: -0.094%
0191: IIYZZY: -0.019%
0193: IIZIII: 0.0%
0196: IIZIIZ: -0.005%
0200: IIZIXZ: 0.012%
0202: IIZIYX: 0.004%
0207: IIZIZY: -0.003%
0209: IIZXII: 0.003%
0216: IIZXXZ: 0.057%
0218: IIZXYX: 0.037%
0221: IIZXZI: -0.001%
0223: IIZXZY: -0.016%
0224: IIZXZZ: 0.011%
0225: IIZYII: 0.005%
0226: IIZYIX: 0.016%
0227: IIZYIY: -0.02%
0228: IIZYIZ: 0.095%
0229: IIZYXI: -0.0%
0230: IIZYXX: 0.052%
0231: IIZYXY: 0.022%
0232: IIZYXZ: 0.017%
0233: IIZYYI: -0.009%
0234: IIZYYX: 0.012%
0235: IIZYYY: -0.042%
0236: IIZYYZ: 0.025%
0237: IIZYZI: 0.007%
0238: IIZYZX: -0.02%
0239: IIZYZY: 0.008%
0240: IIZYZZ: 0.095%
0241: IIZZII: -0.002%
0248: IIZZXZ: 0.008%
0249: IIZZYI: 0.003%
0250: IIZZYX: 0.071%
0252: IIZZYZ: 0.007%
0255: IIZZZY: -0.008%
0257: IXIIII: 0.031%
0260: IXIIIZ: 0.028%
0264: IXIIXZ: 0.09%
0266: IXIIYX: -0.029%
0271: IXIIZY: 0.015%
0285: IXIXZI: 0.04%
0288: IXIXZZ: 0.025%
0293: IXIYXI: 0.038%
0296: IXIYXZ: -0.015%
0313: IXIZYI: 0.035%
0316: IXIZYZ: 0.003%
0321: IXXIII: 0.027%
0324: IXXIIZ: -0.006%
0349: IXXXZI: 0.017%
0352: IXXXZZ: -0.022%
0357: IXXYXI: 0.014%
0360: IXXYXZ: -0.051%
0369: IXXZII: -0.014%
0376: IXXZXZ: 0.04%
0377: IXXZYI: 0.014%
0378: IXXZYX: 0.015%
0380: IXXZYZ: -0.03%
0383: IXXZZY: -0.006%
0385: IXYIII: -0.012%
0386: IXYIIX: -0.052%
0387: IXYIIY: -0.03%
0388: IXYIIZ: 0.027%
0389: IXYIXI: 0.0%
0392: IXYIXZ: 0.086%
0393: IXYIYI: -0.002%
0396: IXYIYZ: 0.126%
0397: IXYIZI: 0.002%
0400: IXYIZZ: 0.055%
0401: IXYXII: -0.005%
0404: IXYXIZ: 0.109%
0405: IXYXXI: -0.001%
0408: IXYXXZ: 0.084%
0409: IXYXYI: 0.009%
0410: IXYXYX: -0.077%
0412: IXYXYZ: 0.075%
0413: IXYXZI: -0.019%
0414: IXYXZX: -0.068%
0415: IXYXZY: -0.01%
0416: IXYXZZ: -0.023%
0417: IXYYII: -0.003%
0420: IXYYIZ: 0.15%
0421: IXYYXI: -0.018%
0422: IXYYXX: -0.06%
0423: IXYYXY: -0.031%
0424: IXYYXZ: -0.019%
0425: IXYYYI: 0.006%
0428: IXYYYZ: 0.192%
0429: IXYYZI: 0.009%
0432: IXYYZZ: 0.096%
0433: IXYZII: 0.013%
0436: IXYZIZ: -0.006%
0437: IXYZXI: -0.009%
0440: IXYZXZ: 0.083%
0441: IXYZYI: -0.013%
0442: IXYZYX: -0.051%
0443: IXYZYY: -0.031%
0444: IXYZYZ: -0.021%
0445: IXYZZI: 0.004%
0448: IXYZZZ: 0.026%
0449: IXZIII: 0.042%
0452: IXZIIZ: 0.02%
0477: IXZXZI: 0.028%
0480: IXZXZZ: -0.023%
0481: IXZYII: -0.012%
0485: IXZYXI: 0.038%
0488: IXZYXZ: 0.002%
0490: IXZYYX: 0.01%
0495: IXZYZY: 0.005%
0505: IXZZYI: 0.04%
0508: IXZZYZ: -0.009%
0513: IYIIII: -0.039%
0516: IYIIIZ: -0.232%
0520: IYIIXZ: -0.022%
0522: IYIIYX: -0.034%
0527: IYIIZY: -0.046%
0541: IYIXZI: -0.083%
0544: IYIXZZ: -0.251%
0549: IYIYXI: -0.132%
0552: IYIYXZ: -0.263%
0569: IYIZYI: -0.096%
0572: IYIZYZ: -0.246%
0577: IYXIII: 0.06%
0580: IYXIIZ: -0.002%
0605: IYXXZI: 0.047%
0608: IYXXZZ: -0.03%
0613: IYXYXI: 0.046%
0616: IYXYXZ: -0.025%
0625: IYXZII: -0.048%
0632: IYXZXZ: 0.059%
0633: IYXZYI: 0.053%
0634: IYXZYX: -0.084%
0636: IYXZYZ: -0.033%
0639: IYXZZY: 0.037%
0641: IYYIII: -0.075%
0644: IYYIIZ: -0.215%
0657: IYYXII: -0.016%
0664: IYYXXZ: -0.024%
0666: IYYXYX: -0.005%
0669: IYYXZI: -0.084%
0671: IYYXZY: -0.05%
0672: IYYXZZ: -0.219%
0677: IYYYXI: -0.144%
0680: IYYYXZ: -0.239%
0697: IYYZYI: -0.104%
0700: IYYZYZ: -0.231%
0705: IYZIII: 0.024%
0706: IYZIIX: -0.045%
0707: IYZIIY: 0.04%
0708: IYZIIZ: 0.04%
0709: IYZIXI: 0.077%
0712: IYZIXZ: 0.112%
0713: IYZIYI: -0.02%
0716: IYZIYZ: -0.045%
0717: IYZIZI: 0.035%
0720: IYZIZZ: 0.063%
0721: IYZXII: 0.121%
0724: IYZXIZ: 0.138%
0725: IYZXXI: 0.111%
0728: IYZXXZ: 0.139%
0729: IYZXYI: 0.074%
0732: IYZXYZ: 0.174%
0733: IYZXZI: 0.03%
0734: IYZXZX: -0.101%
0735: IYZXZY: 0.032%
0736: IYZXZZ: 0.023%
0737: IYZYII: -0.029%
0740: IYZYIZ: 0.001%
0741: IYZYXI: 0.044%
0742: IYZYXX: -0.086%
0743: IYZYXY: 0.063%
0744: IYZYXZ: 0.035%
0745: IYZYYI: 0.007%
0746: IYZYYX: -0.113%
0748: IYZYYZ: 0.055%
0749: IYZYZI: 0.06%
0751: IYZYZY: 0.027%
0752: IYZYZZ: 0.198%
0753: IYZZII: 0.012%
0756: IYZZIZ: 0.041%
0757: IYZZXI: 0.077%
0760: IYZZXZ: 0.157%
0761: IYZZYI: 0.038%
0762: IYZZYX: -0.012%
0763: IYZZYY: 0.084%
0764: IYZZYZ: 0.025%
0765: IYZZZI: 0.015%
0768: IYZZZZ: 0.059%
0769: IZIIII: 0.061%
0772: IZIIIZ: -0.084%
0776: IZIIXZ: -0.007%
0778: IZIIYX: 0.1%
0783: IZIIZY: 0.03%
0797: IZIXZI: 0.033%
0800: IZIXZZ: -0.033%
0805: IZIYXI: 0.043%
0808: IZIYXZ: -0.023%
0825: IZIZYI: 0.026%
0828: IZIZYZ: -0.037%
0833: IZXIII: 0.027%
0834: IZXIIX: -0.125%
0835: IZXIIY: 0.124%
0836: IZXIIZ: 0.062%
0837: IZXIXI: 0.103%
0840: IZXIXZ: 0.176%
0841: IZXIYI: 0.017%
0844: IZXIYZ: 0.139%
0845: IZXIZI: 0.011%
0848: IZXIZZ: 0.06%
0849: IZXXII: 0.091%
0852: IZXXIZ: 0.185%
0853: IZXXXI: 0.171%
0856: IZXXXZ: 0.262%
0857: IZXXYI: 0.079%
0860: IZXXYZ: 0.172%
0861: IZXXZI: 0.019%
0862: IZXXZX: -0.09%
0863: IZXXZY: 0.128%
0864: IZXXZZ: 0.014%
0865: IZXYII: 0.014%
0868: IZXYIZ: 0.177%
0869: IZXYXI: 0.026%
0870: IZXYXX: -0.094%
0871: IZXYXY: 0.133%
0872: IZXYXZ: 0.011%
0873: IZXYYI: 0.02%
0876: IZXYYZ: 0.176%
0877: IZXYZI: 0.087%
0880: IZXYZZ: 0.218%
0881: IZXZII: 0.002%
0884: IZXZIZ: -0.017%
0885: IZXZXI: 0.139%
0888: IZXZXZ: 0.104%
0889: IZXZYI: 0.041%
0890: IZXZYX: -0.028%
0891: IZXZYY: 0.139%
0892: IZXZYZ: 0.016%
0893: IZXZZI: -0.038%
0895: IZXZZY: 0.01%
0896: IZXZZZ: 0.003%
0897: IZYIII: 0.013%
0900: IZYIIZ: -0.138%
0913: IZYXII: 0.072%
0920: IZYXXZ: 0.04%
0922: IZYXYX: 0.066%
0925: IZYXZI: 0.029%
0927: IZYXZY: -0.001%
0928: IZYXZZ: -0.155%
0933: IZYYXI: 0.028%
0936: IZYYXZ: -0.149%
0953: IZYZYI: 0.021%
0956: IZYZYZ: -0.139%
0961: IZZIII: -0.008%
0964: IZZIIZ: -0.002%
0989: IZZXZI: 0.038%
0992: IZZXZZ: 0.043%
0993: IZZYII: 0.079%
0997: IZZYXI: 0.043%
1000: IZZYXZ: 0.069%
1002: IZZYYX: 0.139%
1007: IZZYZY: 0.094%
1017: IZZZYI: 0.012%
1020: IZZZYZ: 0.039%
1025: XIIIII: 0.0%
1028: XIIIIZ: 0.004%
1032: XIIIXZ: -0.004%
1034: XIIIYX: 0.003%
1039: XIIIZY: 0.002%
1053: XIIXZI: -0.0%
1056: XIIXZZ: 0.012%
1061: XIIYXI: -0.003%
1064: XIIYXZ: 0.019%
1081: XIIZYI: 0.0%
1084: XIIZYZ: 0.006%
1137: XIXZII: 0.003%
1144: XIXZXZ: 0.033%
1146: XIXZYX: -0.016%
1151: XIXZZY: 0.011%
1169: XIYXII: -0.005%
1176: XIYXXZ: -0.006%
1178: XIYXYX: -0.036%
1183: XIYXZY: -0.016%
1249: XIZYII: -0.008%
1256: XIZYXZ: -0.003%
1258: XIZYYX: -0.03%
1263: XIZYZY: 0.019%
1281: XXIIII: 0.012%
1288: XXIIXZ: 0.087%
1290: XXIIYX: -0.031%
1295: XXIIZY: 0.017%
1393: XXXZII: -0.013%
1400: XXXZXZ: 0.037%
1402: XXXZYX: 0.014%
1407: XXXZZY: -0.014%
1409: XXYIII: -0.025%
1412: XXYIIZ: -0.154%
1425: XXYXII: -0.015%
1432: XXYXXZ: 0.053%
1434: XXYXYX: -0.077%
1437: XXYXZI: -0.03%
1439: XXYXZY: 0.013%
1440: XXYXZZ: -0.171%
1445: XXYYXI: -0.035%
1448: XXYYXZ: -0.163%
1465: XXYZYI: -0.025%
1468: XXYZYZ: -0.139%
1505: XXZYII: -0.013%
1512: XXZYXZ: 0.025%
1514: XXZYYX: 0.004%
1519: XXZYZY: -0.009%
1537: XYIIII: -0.029%
1544: XYIIXZ: 0.013%
1546: XYIIYX: -0.112%
1551: XYIIZY: 0.025%
1649: XYXZII: 0.014%
1656: XYXZXZ: 0.092%
1658: XYXZYX: -0.053%
1663: XYXZZY: 0.03%
1681: XYYXII: -0.054%
1688: XYYXXZ: 0.118%
1690: XYYXYX: -0.069%
1695: XYYXZY: 0.048%
1729: XYZIII: -0.07%
1732: XYZIIZ: -0.101%
1757: XYZXZI: -0.038%
1760: XYZXZZ: -0.143%
1761: XYZYII: 0.005%
1765: XYZYXI: -0.032%
1768: XYZYXZ: -0.081%
1770: XYZYYX: -0.085%
1775: XYZYZY: -0.007%
1785: XYZZYI: -0.039%
1788: XYZZYZ: -0.096%
1793: XZIIII: 0.009%
1794: XZIIIX: -0.034%
1795: XZIIIY: -0.098%
1796: XZIIIZ: 0.031%
1797: XZIIXI: -0.032%
1798: XZIIXX: 0.011%
1799: XZIIXY: -0.01%
1800: XZIIXZ: 0.057%
1801: XZIIYI: -0.046%
1802: XZIIYX: 0.051%
1803: XZIIYY: -0.103%
1804: XZIIYZ: 0.103%
1805: XZIIZI: 0.038%
1806: XZIIZX: 0.048%
1807: XZIIZY: -0.021%
1808: XZIIZZ: 0.03%
1809: XZIXII: 0.015%
1816: XZIXXZ: -0.022%
1818: XZIXYX: 0.052%
1823: XZIXZY: -0.066%
1825: XZIYII: -0.008%
1832: XZIYXZ: 0.048%
1834: XZIYYX: 0.111%
1839: XZIYZY: -0.067%
1841: XZIZII: 0.097%
1848: XZIZXZ: 0.144%
1850: XZIZYX: 0.245%
1855: XZIZZY: -0.013%
1857: XZXIII: -0.031%
1860: XZXIIZ: -0.074%
1864: XZXIXZ: 0.056%
1866: XZXIYX: 0.076%
1871: XZXIZY: -0.037%
1873: XZXXII: -0.015%
1880: XZXXXZ: 0.029%
1882: XZXXYX: 0.074%
1885: XZXXZI: -0.038%
1887: XZXXZY: -0.076%
1888: XZXXZZ: -0.089%
1889: XZXYII: -0.054%
1893: XZXYXI: -0.039%
1896: XZXYXZ: -0.037%
1898: XZXYYX: -0.092%
1903: XZXYZY: -0.04%
1905: XZXZII: -0.03%
1906: XZXZIX: -0.055%
1907: XZXZIY: -0.113%
1908: XZXZIZ: -0.032%
1909: XZXZXI: -0.016%
1910: XZXZXX: -0.056%
1911: XZXZXY: 0.017%
1912: XZXZXZ: 0.032%
1913: XZXZYI: -0.05%
1914: XZXZYX: -0.051%
1915: XZXZYY: -0.169%
1916: XZXZYZ: 0.044%
1917: XZXZZI: 0.022%
1918: XZXZZX: 0.059%
1919: XZXZZY: 0.068%
1920: XZXZZZ: -0.024%
1921: XZYIII: -0.066%
1928: XZYIXZ: 0.092%
1930: XZYIYX: 0.052%
1935: XZYIZY: -0.057%
1937: XZYXII: -0.035%
1938: XZYXIX: -0.091%
1939: XZYXIY: -0.168%
1940: XZYXIZ: -0.083%
1941: XZYXXI: -0.062%
1942: XZYXXX: -0.036%
1943: XZYXXY: -0.023%
1944: XZYXXZ: 0.02%
1945: XZYXYI: -0.098%
1946: XZYXYX: -0.042%
1947: XZYXYY: -0.171%
1948: XZYXYZ: 0.135%
1949: XZYXZI: -0.036%
1950: XZYXZX: 0.087%
1951: XZYXZY: -0.02%
1952: XZYXZZ: -0.067%
1953: XZYYII: -0.049%
1960: XZYYXZ: -0.019%
1962: XZYYYX: 0.044%
1967: XZYYZY: -0.115%
1969: XZYZII: -0.005%
1976: XZYZXZ: 0.014%
1978: XZYZYX: -0.022%
1983: XZYZZY: -0.029%
1985: XZZIII: 0.033%
1992: XZZIXZ: 0.201%
1994: XZZIYX: 0.175%
1999: XZZIZY: 0.05%
2001: XZZXII: 0.009%
2008: XZZXXZ: -0.09%
2010: XZZXYX: -0.096%
2015: XZZXZY: -0.138%
2017: XZZYII: -0.0%
2018: XZZYIX: -0.027%
2019: XZZYIY: -0.108%
2020: XZZYIZ: 0.019%
2021: XZZYXI: 0.014%
2022: XZZYXX: -0.01%
2023: XZZYXY: 0.034%
2024: XZZYXZ: -0.002%
2025: XZZYYI: -0.096%
2026: XZZYYX: -0.053%
2027: XZZYYY: -0.168%
2028: XZZYYZ: 0.145%
2029: XZZYZI: 0.05%
2030: XZZYZX: 0.073%
2031: XZZYZY: 0.051%
2032: XZZYZZ: 0.027%
2033: XZZZII: 0.072%
2040: XZZZXZ: 0.161%
2042: XZZZYX: 0.191%
2047: XZZZZY: -0.033%
2049: YIIIII: 0.0%
2052: YIIIIZ: 0.043%
2056: YIIIXZ: -0.003%
2058: YIIIYX: -0.016%
2063: YIIIZY: 0.023%
2077: YIIXZI: -0.003%
2080: YIIXZZ: 0.084%
2085: YIIYXI: -0.002%
2088: YIIYXZ: 0.069%
2105: YIIZYI: -0.002%
2108: YIIZYZ: 0.047%
2161: YIXZII: 0.011%
2168: YIXZXZ: 0.011%
2170: YIXZYX: 0.014%
2175: YIXZZY: 0.007%
2193: YIYXII: -0.008%
2200: YIYXXZ: -0.006%
2202: YIYXYX: -0.002%
2207: YIYXZY: 0.002%
2273: YIZYII: 0.014%
2280: YIZYXZ: 0.021%
2282: YIZYYX: -0.018%
2287: YIZYZY: 0.011%
2305: YXIIII: 0.004%
2306: YXIIIX: 0.033%
2307: YXIIIY: 0.004%
2308: YXIIIZ: 0.125%
2309: YXIIXI: -0.021%
2310: YXIIXX: -0.0%
2311: YXIIXY: 0.062%
2312: YXIIXZ: 0.002%
2313: YXIIYI: 0.007%
2314: YXIIYX: 0.024%
2315: YXIIYY: -0.005%
2316: YXIIYZ: 0.138%
2317: YXIIZI: 0.024%
2318: YXIIZX: 0.014%
2319: YXIIZY: 0.005%
2320: YXIIZZ: 0.138%
2321: YXIXII: -0.032%
2328: YXIXXZ: -0.014%
2330: YXIXYX: -0.114%
2335: YXIXZY: 0.018%
2337: YXIYII: 0.018%
2344: YXIYXZ: 0.116%
2346: YXIYYX: 0.029%
2351: YXIYZY: 0.031%
2353: YXIZII: 0.009%
2360: YXIZXZ: -0.027%
2362: YXIZYX: 0.037%
2367: YXIZZY: -0.023%
2369: YXXIII: -0.03%
2376: YXXIXZ: -0.015%
2378: YXXIYX: -0.111%
2383: YXXIZY: 0.002%
2385: YXXXII: -0.024%
2392: YXXXXZ: -0.015%
2394: YXXXYX: -0.118%
2399: YXXXZY: 0.032%
2401: YXXYII: -0.007%
2408: YXXYXZ: 0.086%
2410: YXXYYX: 0.045%
2415: YXXYZY: -0.0%
2417: YXXZII: -0.0%
2418: YXXZIX: 0.056%
2419: YXXZIY: -0.036%
2420: YXXZIZ: 0.036%
2421: YXXZXI: -0.006%
2422: YXXZXX: 0.031%
2423: YXXZXY: 0.067%
2424: YXXZXZ: 0.131%
2425: YXXZYI: -0.027%
2426: YXXZYX: 0.014%
2427: YXXZYY: 0.006%
2428: YXXZYZ: 0.133%
2429: YXXZZI: -0.0%
2430: YXXZZX: 0.053%
2431: YXXZZY: -0.028%
2432: YXXZZZ: 0.072%
2433: YXYIII: 0.006%
2436: YXYIIZ: 0.044%
2440: YXYIXZ: -0.032%
2442: YXYIYX: 0.027%
2447: YXYIZY: 0.018%
2449: YXYXII: 0.004%
2450: YXYXIX: 0.048%
2451: YXYXIY: 0.007%
2452: YXYXIZ: 0.127%
2453: YXYXXI: -0.01%
2454: YXYXXX: 0.038%
2455: YXYXXY: 0.08%
2456: YXYXXZ: 0.008%
2457: YXYXYI: -0.004%
2458: YXYXYX: -0.025%
2459: YXYXYY: -0.007%
2460: YXYXYZ: 0.2%
2461: YXYXZI: 0.005%
2462: YXYXZX: 0.038%
2463: YXYXZY: 0.003%
2464: YXYXZZ: 0.116%
2465: YXYYII: 0.014%
2469: YXYYXI: -0.02%
2472: YXYYXZ: 0.087%
2474: YXYYYX: -0.023%
2479: YXYYZY: 0.004%
2481: YXYZII: -0.033%
2488: YXYZXZ: 0.108%
2489: YXYZYI: -0.015%
2490: YXYZYX: 0.062%
2492: YXYZYZ: 0.061%
2495: YXYZZY: -0.023%
2497: YXZIII: 0.02%
2504: YXZIXZ: -0.051%
2506: YXZIYX: 0.075%
2511: YXZIZY: -0.019%
2513: YXZXII: -0.021%
2520: YXZXXZ: 0.025%
2522: YXZXYX: 0.064%
2527: YXZXZY: -0.017%
2529: YXZYII: 0.001%
2530: YXZYIX: 0.034%
2531: YXZYIY: -0.005%
2532: YXZYIZ: 0.082%
2533: YXZYXI: -0.002%
2534: YXZYXX: 0.02%
2535: YXZYXY: 0.063%
2536: YXZYXZ: 0.103%
2537: YXZYYI: -0.025%
2538: YXZYYX: 0.004%
2539: YXZYYY: 0.025%
2540: YXZYYZ: 0.129%
2541: YXZYZI: 0.004%
2542: YXZYZX: 0.044%
2543: YXZYZY: -0.018%
2544: YXZYZZ: 0.108%
2545: YXZZII: 0.004%
2552: YXZZXZ: 0.002%
2554: YXZZYX: 0.013%
2559: YXZZZY: -0.036%
2561: YYIIII: -0.015%
2568: YYIIXZ: -0.023%
2570: YYIIYX: 0.014%
2575: YYIIZY: -0.051%
2673: YYXZII: -0.062%
2680: YYXZXZ: 0.048%
2682: YYXZYX: -0.091%
2687: YYXZZY: 0.035%
2705: YYYXII: -0.031%
2712: YYYXXZ: -0.035%
2714: YYYXYX: -0.014%
2719: YYYXZY: -0.064%
2753: YYZIII: 0.115%
2756: YYZIIZ: 0.186%
2781: YYZXZI: 0.182%
2784: YYZXZZ: 0.3%
2785: YYZYII: -0.043%
2789: YYZYXI: 0.174%
2792: YYZYXZ: 0.166%
2794: YYZYYX: -0.106%
2799: YYZYZY: 0.027%
2809: YYZZYI: 0.134%
2812: YYZZYZ: 0.207%
2817: YZIIII: 0.01%
2824: YZIIXZ: -0.06%
2826: YZIIYX: -0.122%
2831: YZIIZY: -0.041%
2881: YZXIII: 0.113%
2884: YZXIIZ: 0.124%
2909: YZXXZI: 0.131%
2912: YZXXZZ: 0.24%
2917: YZXYXI: 0.089%
2920: YZXYXZ: 0.188%
2929: YZXZII: -0.039%
2936: YZXZXZ: 0.075%
2937: YZXZYI: 0.105%
2938: YZXZYX: -0.063%
2940: YZXZYZ: 0.127%
2943: YZXZZY: 0.004%
2961: YZYXII: 0.047%
2968: YZYXXZ: 0.068%
2970: YZYXYX: -0.041%
2975: YZYXZY: 0.027%
3041: YZZYII: -0.069%
3048: YZZYXZ: -0.005%
3050: YZZYYX: -0.111%
3055: YZZYZY: 0.011%
3073: ZIIIII: 0.001%
3074: ZIIIIX: -0.008%
3075: ZIIIIY: 0.035%
3076: ZIIIIZ: 0.0%
3077: ZIIIXI: -0.006%
3080: ZIIIXZ: 0.026%
3081: ZIIIYI: -0.007%
3082: ZIIIYX: 0.027%
3084: ZIIIYZ: 0.045%
3085: ZIIIZI: 0.001%
3087: ZIIIZY: 0.024%
3088: ZIIIZZ: 0.033%
3089: ZIIXII: -0.003%
3092: ZIIXIZ: 0.047%
3093: ZIIXXI: -0.005%
3096: ZIIXXZ: 0.094%
3097: ZIIXYI: 0.001%
3100: ZIIXYZ: -0.017%
3101: ZIIXZI: -0.001%
3102: ZIIXZX: -0.01%
3103: ZIIXZY: 0.035%
3104: ZIIXZZ: -0.01%
3105: ZIIYII: -0.004%
3108: ZIIYIZ: 0.066%
3109: ZIIYXI: -0.0%
3110: ZIIYXX: -0.017%
3111: ZIIYXY: 0.028%
3112: ZIIYXZ: -0.004%
3113: ZIIYYI: 0.003%
3116: ZIIYYZ: 0.007%
3117: ZIIYZI: 0.003%
3120: ZIIYZZ: 0.025%
3121: ZIIZII: -0.001%
3124: ZIIZIZ: 0.028%
3125: ZIIZXI: 0.003%
3128: ZIIZXZ: 0.006%
3129: ZIIZYI: 0.001%
3130: ZIIZYX: -0.008%
3131: ZIIZYY: 0.043%
3132: ZIIZYZ: -0.005%
3133: ZIIZZI: -0.004%
3136: ZIIZZZ: 0.064%
3137: ZIXIII: 0.001%
3140: ZIXIIZ: -0.013%
3165: ZIXXZI: -0.007%
3168: ZIXXZZ: -0.007%
3173: ZIXYXI: -0.002%
3176: ZIXYXZ: -0.001%
3185: ZIXZII: 0.019%
3192: ZIXZXZ: -0.001%
3193: ZIXZYI: -0.005%
3194: ZIXZYX: 0.079%
3196: ZIXZYZ: -0.022%
3199: ZIXZZY: 0.016%
3201: ZIYIII: -0.001%
3204: ZIYIIZ: -0.067%
3217: ZIYXII: 0.001%
3224: ZIYXXZ: 0.014%
3226: ZIYXYX: 0.034%
3229: ZIYXZI: 0.005%
3231: ZIYXZY: 0.049%
3232: ZIYXZZ: -0.097%
3237: ZIYYXI: 0.006%
3240: ZIYYXZ: -0.059%
3257: ZIYZYI: 0.004%
3260: ZIYZYZ: -0.073%
3265: ZIZIII: -0.001%
3268: ZIZIIZ: 0.013%
3293: ZIZXZI: -0.005%
3296: ZIZXZZ: 0.026%
3297: ZIZYII: 0.014%
3301: ZIZYXI: -0.005%
3304: ZIZYXZ: 0.002%
3306: ZIZYYX: 0.054%
3311: ZIZYZY: -0.001%
3321: ZIZZYI: -0.001%
3324: ZIZZYZ: 0.028%
3329: ZXIIII: 0.031%
3332: ZXIIIZ: 0.023%
3336: ZXIIXZ: 0.056%
3338: ZXIIYX: 0.024%
3343: ZXIIZY: 0.032%
3357: ZXIXZI: 0.041%
3360: ZXIXZZ: 0.018%
3365: ZXIYXI: 0.038%
3368: ZXIYXZ: -0.027%
3385: ZXIZYI: 0.032%
3388: ZXIZYZ: -0.003%
3393: ZXXIII: 0.025%
3396: ZXXIIZ: -0.012%
3421: ZXXXZI: 0.02%
3424: ZXXXZZ: -0.03%
3429: ZXXYXI: 0.016%
3432: ZXXYXZ: -0.061%
3441: ZXXZII: 0.047%
3448: ZXXZXZ: 0.125%
3449: ZXXZYI: 0.014%
3450: ZXXZYX: 0.012%
3452: ZXXZYZ: -0.038%
3455: ZXXZZY: 0.049%
3457: ZXYIII: -0.007%
3458: ZXYIIX: -0.036%
3459: ZXYIIY: -0.024%
3460: ZXYIIZ: 0.048%
3461: ZXYIXI: 0.011%
3464: ZXYIXZ: 0.092%
3465: ZXYIYI: -0.005%
3468: ZXYIYZ: 0.143%
3469: ZXYIZI: 0.01%
3472: ZXYIZZ: 0.02%
3473: ZXYXII: 0.027%
3476: ZXYXIZ: 0.108%
3477: ZXYXXI: 0.008%
3480: ZXYXXZ: 0.093%
3481: ZXYXYI: 0.016%
3482: ZXYXYX: 0.032%
3484: ZXYXYZ: 0.047%
3485: ZXYXZI: -0.017%
3486: ZXYXZX: -0.049%
3487: ZXYXZY: 0.012%
3488: ZXYXZZ: 0.019%
3489: ZXYYII: -0.005%
3492: ZXYYIZ: 0.168%
3493: ZXYYXI: -0.012%
3494: ZXYYXX: -0.036%
3495: ZXYYXY: -0.025%
3496: ZXYYXZ: 0.018%
3497: ZXYYYI: 0.004%
3500: ZXYYYZ: 0.218%
3501: ZXYYZI: 0.015%
3504: ZXYYZZ: 0.069%
3505: ZXYZII: 0.021%
3508: ZXYZIZ: -0.02%
3509: ZXYZXI: -0.009%
3512: ZXYZXZ: 0.046%
3513: ZXYZYI: -0.009%
3514: ZXYZYX: -0.027%
3515: ZXYZYY: -0.024%
3516: ZXYZYZ: 0.009%
3517: ZXYZZI: 0.015%
3520: ZXYZZZ: 0.008%
3521: ZXZIII: 0.044%
3524: ZXZIIZ: 0.021%
3549: ZXZXZI: 0.03%
3552: ZXZXZZ: -0.029%
3553: ZXZYII: 0.038%
3557: ZXZYXI: 0.037%
3560: ZXZYXZ: 0.012%
3562: ZXZYYX: 0.018%
3567: ZXZYZY: 0.072%
3577: ZXZZYI: 0.044%
3580: ZXZZYZ: -0.012%
3585: ZYIIII: 0.011%
3586: ZYIIIX: 0.104%
3587: ZYIIIY: 0.007%
3588: ZYIIIZ: -0.118%
3589: ZYIIXI: 0.041%
3590: ZYIIXX: 0.072%
3591: ZYIIXY: 0.087%
3592: ZYIIXZ: 0.051%
3593: ZYIIYI: 0.039%
3594: ZYIIYX: 0.038%
3595: ZYIIYY: 0.008%
3596: ZYIIYZ: 0.108%
3597: ZYIIZI: 0.037%
3598: ZYIIZX: 0.02%
3599: ZYIIZY: 0.018%
3600: ZYIIZZ: -0.029%
3601: ZYIXII: 0.034%
3608: ZYIXXZ: 0.17%
3610: ZYIXYX: 0.045%
3613: ZYIXZI: -0.093%
3615: ZYIXZY: 0.047%
3616: ZYIXZZ: -0.238%
3617: ZYIYII: -0.046%
3621: ZYIYXI: -0.14%
3624: ZYIYXZ: -0.114%
3626: ZYIYYX: -0.039%
3631: ZYIYZY: -0.043%
3633: ZYIZII: 0.049%
3640: ZYIZXZ: 0.154%
3641: ZYIZYI: -0.1%
3642: ZYIZYX: 0.129%
3644: ZYIZYZ: -0.23%
3647: ZYIZZY: 0.026%
3649: ZYXIII: 0.03%
3652: ZYXIIZ: -0.016%
3656: ZYXIXZ: 0.138%
3658: ZYXIYX: 0.07%
3663: ZYXIZY: 0.054%
3665: ZYXXII: 0.018%
3672: ZYXXXZ: 0.189%
3674: ZYXXYX: 0.092%
3677: ZYXXZI: 0.031%
3679: ZYXXZY: 0.045%
3680: ZYXXZZ: -0.052%
3681: ZYXYII: 0.017%
3685: ZYXYXI: 0.03%
3688: ZYXYXZ: -0.066%
3690: ZYXYYX: -0.097%
3695: ZYXYZY: -0.111%
3697: ZYXZII: -0.009%
3698: ZYXZIX: 0.025%
3699: ZYXZIY: 0.005%
3700: ZYXZIZ: -0.131%
3701: ZYXZXI: -0.049%
3702: ZYXZXX: 0.096%
3703: ZYXZXY: 0.057%
3704: ZYXZXZ: 0.12%
3705: ZYXZYI: 0.051%
3706: ZYXZYX: 0.013%
3707: ZYXZYY: -0.038%
3708: ZYXZYZ: 0.033%
3709: ZYXZZI: -0.06%
3710: ZYXZZX: 0.088%
3711: ZYXZZY: 0.059%
3712: ZYXZZZ: -0.128%
3713: ZYYIII: -0.058%
3716: ZYYIIZ: -0.207%
3720: ZYYIXZ: 0.016%
3722: ZYYIYX: 0.033%
3727: ZYYIZY: -0.056%
3729: ZYYXII: 0.006%
3730: ZYYXIX: 0.062%
3731: ZYYXIY: -0.051%
3732: ZYYXIZ: -0.098%
3733: ZYYXXI: 0.02%
3734: ZYYXXX: 0.034%
3735: ZYYXXY: 0.083%
3736: ZYYXXZ: 0.08%
3737: ZYYXYI: -0.003%
3738: ZYYXYX: 0.008%
3739: ZYYXYY: -0.016%
3740: ZYYXYZ: 0.124%
3741: ZYYXZI: -0.053%
3742: ZYYXZX: 0.069%
3743: ZYYXZY: 0.079%
3744: ZYYXZZ: -0.151%
3745: ZYYYII: -0.07%
3749: ZYYYXI: -0.154%
3752: ZYYYXZ: -0.11%
3754: ZYYYYX: -0.1%
3759: ZYYYZY: -0.068%
3761: ZYYZII: 0.017%
3768: ZYYZXZ: -0.077%
3769: ZYYZYI: -0.109%
3770: ZYYZYX: -0.073%
3772: ZYYZYZ: -0.22%
3775: ZYYZZY: -0.086%
3777: ZYZIII: 0.03%
3778: ZYZIIX: -0.027%
3779: ZYZIIY: 0.031%
3780: ZYZIIZ: 0.037%
3781: ZYZIXI: 0.096%
3784: ZYZIXZ: 0.143%
3785: ZYZIYI: -0.038%
3786: ZYZIYX: 0.182%
3788: ZYZIYZ: -0.036%
3789: ZYZIZI: 0.026%
3791: ZYZIZY: -0.001%
3792: ZYZIZZ: 0.049%
3793: ZYZXII: 0.03%
3796: ZYZXIZ: 0.117%
3797: ZYZXXI: 0.128%
3800: ZYZXXZ: 0.009%
3801: ZYZXYI: 0.066%
3802: ZYZXYX: -0.155%
3804: ZYZXYZ: 0.164%
3805: ZYZXZI: 0.019%
3806: ZYZXZX: -0.075%
3807: ZYZXZY: -0.004%
3808: ZYZXZZ: -0.062%
3809: ZYZYII: 0.008%
3810: ZYZYIX: 0.054%
3811: ZYZYIY: -0.011%
3812: ZYZYIZ: -0.011%
3813: ZYZYXI: 0.039%
3814: ZYZYXX: 0.056%
3815: ZYZYXY: 0.058%
3816: ZYZYXZ: 0.009%
3817: ZYZYYI: 0.006%
3818: ZYZYYX: 0.018%
3819: ZYZYYY: -0.027%
3820: ZYZYYZ: 0.129%
3821: ZYZYZI: 0.041%
3822: ZYZYZX: 0.143%
3823: ZYZYZY: 0.055%
3824: ZYZYZZ: 0.084%
3825: ZYZZII: 0.013%
3828: ZYZZIZ: 0.038%
3829: ZYZZXI: 0.059%
3832: ZYZZXZ: 0.161%
3833: ZYZZYI: 0.042%
3834: ZYZZYX: 0.052%
3835: ZYZZYY: 0.079%
3836: ZYZZYZ: -0.002%
3837: ZYZZZI: 0.024%
3839: ZYZZZY: 0.017%
3840: ZYZZZZ: 0.056%
3841: ZZIIII: 0.061%
3844: ZZIIIZ: -0.063%
3848: ZZIIXZ: -0.023%
3850: ZZIIYX: 0.091%
3855: ZZIIZY: 0.024%
3869: ZZIXZI: 0.036%
3872: ZZIXZZ: -0.016%
3877: ZZIYXI: 0.049%
3880: ZZIYXZ: -0.017%
3897: ZZIZYI: 0.038%
3900: ZZIZYZ: -0.024%
3905: ZZXIII: 0.027%
3906: ZZXIIX: -0.093%
3907: ZZXIIY: 0.133%
3908: ZZXIIZ: 0.076%
3909: ZZXIXI: 0.106%
3912: ZZXIXZ: 0.151%
3913: ZZXIYI: 0.01%
3916: ZZXIYZ: 0.159%
3917: ZZXIZI: 0.001%
3920: ZZXIZZ: 0.034%
3921: ZZXXII: 0.115%
3924: ZZXXIZ: 0.18%
3925: ZZXXXI: 0.183%
3928: ZZXXXZ: 0.238%
3929: ZZXXYI: 0.073%
3932: ZZXXYZ: 0.186%
3933: ZZXXZI: 0.005%
3934: ZZXXZX: -0.05%
3935: ZZXXZY: 0.14%
3936: ZZXXZZ: -0.091%
3937: ZZXYII: 0.012%
3940: ZZXYIZ: 0.202%
3941: ZZXYXI: 0.032%
3942: ZZXYXX: -0.056%
3943: ZZXYXY: 0.147%
3944: ZZXYXZ: -0.069%
3945: ZZXYYI: 0.01%
3948: ZZXYYZ: 0.205%
3949: ZZXYZI: 0.089%
3952: ZZXYZZ: 0.238%
3953: ZZXZII: -0.01%
3956: ZZXZIZ: -0.05%
3957: ZZXZXI: 0.132%
3960: ZZXZXZ: 0.097%
3961: ZZXZYI: 0.04%
3962: ZZXZYX: -0.015%
3963: ZZXZYY: 0.149%
3964: ZZXZYZ: -0.034%
3965: ZZXZZI: -0.05%
3967: ZZXZZY: -0.0%
3968: ZZXZZZ: -0.03%
3969: ZZYIII: -0.001%
3972: ZZYIIZ: -0.15%
3985: ZZYXII: 0.07%
3992: ZZYXXZ: 0.032%
3994: ZZYXYX: 0.07%
3997: ZZYXZI: 0.024%
3999: ZZYXZY: 0.001%
4000: ZZYXZZ: -0.165%
4005: ZZYYXI: 0.022%
4008: ZZYYXZ: -0.157%
4025: ZZYZYI: 0.007%
4028: ZZYZYZ: -0.148%
4033: ZZZIII: -0.0%
4036: ZZZIIZ: 0.007%
4061: ZZZXZI: 0.04%
4064: ZZZXZZ: 0.049%
4065: ZZZYII: 0.065%
4069: ZZZYXI: 0.05%
4072: ZZZYXZ: 0.056%
4074: ZZZYYX: 0.131%
4079: ZZZYZY: 0.079%
4089: ZZZZYI: 0.025%
4092: ZZZZYZ: 0.039%
Mean error = 0.0116%, std = 0.07076%

Running the peeling decoder

  • We don't have all the values in the oracle, but we have enough to run the decoder.
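Before diving into the full machinery, the peeling idea itself can be shown with a toy example: hash a sparse vector into bins, identify bins containing exactly one nonzero entry (singletons), and subtract each recovered entry from every binning until everything is peeled. This Python sketch is purely illustrative — the hash functions, bin counts and the singleton test are simplified stand-ins for what the workbook does with Walsh-Hadamard subsampling and offset experiments.

```python
# Toy peeling decoder: the signal and hash choices below are hypothetical.
sparse = {3: 0.5, 9: 0.3, 14: 0.2}          # sparse signal on 16 indices

hashes = [lambda i: i % 4, lambda i: i // 4]  # two independent binnings

# Each bin keeps a value sum and an index-weighted sum, so a bin holding
# exactly one nonzero entry (a "singleton") reveals that entry directly.
bins = []
for h in hashes:
    sums, isums = [0.0] * 4, [0.0] * 4
    for idx, val in sparse.items():
        sums[h(idx)] += val
        isums[h(idx)] += val * idx
    bins.append((sums, isums))

recovered = {}
for _ in range(10):                           # a few peeling passes
    for h, (sums, isums) in zip(hashes, bins):
        for b in range(4):
            v = sums[b]
            if abs(v) < 1e-12:
                continue                      # empty bin ("zeroton")
            idx = isums[b] / v                # candidate singleton index
            r = round(idx)
            if abs(idx - r) < 1e-9 and h(r) == b:
                recovered[r] = v
                for h2, (s2, i2) in zip(hashes, bins):
                    s2[h2(r)] -= v            # peel it out of every binning
                    i2[h2(r)] -= v * r
```

In the real protocol the bins hold Walsh-Hadamard sums of Pauli eigenvalues rather than direct value sums, but the iterate-detect-subtract loop is the same.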
In [57]:
# Use the patterns to create the listOfPs

# What is this? It tells us which Pauli errors fell into which bin, given our choice of experiments.
# So, for example, the first vector (listOfPs[1][1]) lists the bit-string bins for the first pattern of the first experiment.
listOfPs=[]
for p in paulisAll
    hMap = []
    # Because here we use right-hand least significant, we just reverse the order in which we stored the experiments.
    for i in reverse(p)
       #print("Length $(length(i))\n")
       if length(i) == 2
            push!(hMap,twoPattern(i))
        elseif length(i) == 4
            push!(hMap,fourPattern([i])[1])
        else # Assume a binary bit pattern
            push!(hMap,[length(i)])
        end
    end
    push!(listOfPs,hMap)
end
listOfPs
Out[57]:
2-element Array{Any,1}:
 Any[[["0000", "0110", "1101", "1011"], ["1100", "1010", "0001", "0111"], ["1000", "1110", "0101", "0011"], ["0100", "0010", "1001", "1111"]], [["0000", "0110", "1101", "1011"], ["1100", "1010", "0001", "0111"], ["1000", "1110", "0101", "0011"], ["0100", "0010", "1001", "1111"]], [["0000", "0110", "1101", "1011"], ["1100", "1010", "0001", "0111"], ["1000", "1110", "0101", "0011"], ["0100", "0010", "1001", "1111"]]]
 Any[[["00", "11"], ["10", "01"]], [["0000", "1110", "1001", "0111"], ["1100", "0010", "0101", "1011"], ["0100", "1010", "1101", "0011"], ["1000", "0110", "0001", "1111"]], [["0000", "0110", "1101", "1011"], ["1100", "1010", "0001", "0111"], ["1000", "1110", "0101", "0011"], ["0100", "0010", "1001", "1111"]], [["00", "11"], ["10", "01"]]]
In [58]:
samples = []
for (ix,x) in enumerate(paulisAll)
    # Similarly, since we reversed above (right-hand least significant), we reverse here too.
   push!(samples,[[y for y in generateFromPVecSamples4N(reverse(x),d)] for d in ds])
end
samples
Out[58]:
2-element Array{Any,1}:
 [[0, 14, 7, 9, 224, 238, 231, 233, 112, 126  …  2535, 2537, 2416, 2430, 2423, 2425, 2448, 2462, 2455, 2457], [2048, 2062, 2055, 2057, 2272, 2286, 2279, 2281, 2160, 2174  …  487, 489, 368, 382, 375, 377, 400, 414, 407, 409], [1024, 1038, 1031, 1033, 1248, 1262, 1255, 1257, 1136, 1150  …  3559, 3561, 3440, 3454, 3447, 3449, 3472, 3486, 3479, 3481], [512, 526, 519, 521, 736, 750, 743, 745, 624, 638  …  3047, 3049, 2928, 2942, 2935, 2937, 2960, 2974, 2967, 2969], [256, 270, 263, 265, 480, 494, 487, 489, 368, 382  …  2279, 2281, 2160, 2174, 2167, 2169, 2192, 2206, 2199, 2201], [128, 142, 135, 137, 96, 110, 103, 105, 240, 254  …  2407, 2409, 2544, 2558, 2551, 2553, 2320, 2334, 2327, 2329], [64, 78, 71, 73, 160, 174, 167, 169, 48, 62  …  2471, 2473, 2352, 2366, 2359, 2361, 2512, 2526, 2519, 2521], [32, 46, 39, 41, 192, 206, 199, 201, 80, 94  …  2503, 2505, 2384, 2398, 2391, 2393, 2480, 2494, 2487, 2489], [16, 30, 23, 25, 240, 254, 247, 249, 96, 110  …  2551, 2553, 2400, 2414, 2407, 2409, 2432, 2446, 2439, 2441], [8, 6, 15, 1, 232, 230, 239, 225, 120, 118  …  2543, 2529, 2424, 2422, 2431, 2417, 2456, 2454, 2463, 2449], [4, 10, 3, 13, 228, 234, 227, 237, 116, 122  …  2531, 2541, 2420, 2426, 2419, 2429, 2452, 2458, 2451, 2461], [2, 12, 5, 11, 226, 236, 229, 235, 114, 124  …  2533, 2539, 2418, 2428, 2421, 2427, 2450, 2460, 2453, 2459], [1, 15, 6, 8, 225, 239, 230, 232, 113, 127  …  2534, 2536, 2417, 2431, 2422, 2424, 2449, 2463, 2454, 2456]]
 [[0, 3, 56, 59, 28, 31, 36, 39, 832, 835  …  3812, 3815, 3456, 3459, 3512, 3515, 3484, 3487, 3492, 3495], [2048, 2051, 2104, 2107, 2076, 2079, 2084, 2087, 2880, 2883  …  1764, 1767, 1408, 1411, 1464, 1467, 1436, 1439, 1444, 1447], [1024, 1027, 1080, 1083, 1052, 1055, 1060, 1063, 1856, 1859  …  2788, 2791, 2432, 2435, 2488, 2491, 2460, 2463, 2468, 2471], [512, 515, 568, 571, 540, 543, 548, 551, 320, 323  …  3300, 3303, 3968, 3971, 4024, 4027, 3996, 3999, 4004, 4007], [256, 259, 312, 315, 284, 287, 292, 295, 576, 579  …  4068, 4071, 3200, 3203, 3256, 3259, 3228, 3231, 3236, 3239], [128, 131, 184, 187, 156, 159, 164, 167, 960, 963  …  3684, 3687, 3328, 3331, 3384, 3387, 3356, 3359, 3364, 3367], [64, 67, 120, 123, 92, 95, 100, 103, 768, 771  …  3748, 3751, 3520, 3523, 3576, 3579, 3548, 3551, 3556, 3559], [32, 35, 24, 27, 60, 63, 4, 7, 864, 867  …  3780, 3783, 3488, 3491, 3480, 3483, 3516, 3519, 3460, 3463], [16, 19, 40, 43, 12, 15, 52, 55, 848, 851  …  3828, 3831, 3472, 3475, 3496, 3499, 3468, 3471, 3508, 3511], [8, 11, 48, 51, 20, 23, 44, 47, 840, 843  …  3820, 3823, 3464, 3467, 3504, 3507, 3476, 3479, 3500, 3503], [4, 7, 60, 63, 24, 27, 32, 35, 836, 839  …  3808, 3811, 3460, 3463, 3516, 3519, 3480, 3483, 3488, 3491], [2, 1, 58, 57, 30, 29, 38, 37, 834, 833  …  3814, 3813, 3458, 3457, 3514, 3513, 3486, 3485, 3494, 3493], [1, 2, 57, 58, 29, 30, 37, 38, 833, 834  …  3813, 3814, 3457, 3458, 3513, 3514, 3485, 3486, 3493, 3494]]
In [59]:
## This is just to remind ourselves these are the Paulis we might hope to recover
## In particular we hid a high-weight Pauli with an error of 0.004; it appears below as 'number' 816
for i = 1:4096
    if (dist[i]>0.0001)
        print("$(string(i-1,base=2,pad=12)): $i $(dist[i])\n")
    end
end
000000000000: 1 0.9566049496644947
000000000001: 2 0.009742443708564856
000000000010: 3 0.0038969774834259428
000000000100: 5 0.0002882968036207967
000000010000: 17 0.00019223633173193815
000000110000: 49 0.0002883544975979071
000010000000: 129 0.0002882968036207967
000100000000: 257 0.003936905531411864
001000000000: 513 0.01968452765705932
001000000001: 514 0.0001996402399296077
001100101111: 816 0.004
100000000000: 2049 0.0002882968036207967
In [60]:
([oracleToUse[x+1] for x in samples[2][1]])
Out[60]:
64-element Array{Float64,1}:
 1.0
 0.9720291467193548
 0.9991998946835859
 0.9711926086643352
 0.9981857588412699
 0.9702571630197817
 0.9986028494533448
 0.9705848390270296
 0.9520673532412371
 0.925736192005811
 0.9514365391411761
 0.9245680352867948
 0.9502737041497508
 ⋮
 0.9489201111745996
 0.9213777820094172
 0.9494913862208747
 0.9224038681901503
 0.9825468776014576
 0.9553338422655632
 0.981732052552941
 0.9541915237156421
 0.9806621081060998
 0.9533235904595504
 0.9811036366343943
 0.953703278609339
In [61]:
using LinearAlgebra
maxPass = 200
# `singletons` is the 'noise' threshold below which we declare we have found a singleton.
# It will be related to the measurement accuracy and the number of bins.
# Here we base it off the shotsToDo variance, on the basis of our hoped-for recovery.

# We start it low and then slowly increase it, meaning we become more likely to accept.
# If you have a particular probability distribution and this ansatz is not working, set it
# so that you get a reasonable number of hits in the first round.
singletons = (0.001*.999)/30000
singletonsInc = singletons/2

# zerosC is set high - we don't want to accept bins with very low values, as they are probably just noise.
# If the sum over all the offsets of (mean - value)^2 is below this number, we ignore the bin.
# But then we lower it, meaning we are less likely to decide a bin has no value in it.
# Obviously it should never go negative.
zerosC = (0.001*.999)/20000*2*1.1
zerosDec = (zerosC*0.99)/maxPass


prevFound = 0
qubitSize = 6
j6=diagm(1 => vec(hcat([[1 0] for i = 1:6]...)[1:end-1]),-1 => vec(hcat([[1 0] for i = 1:6]...)[1:end-1]))

listOfX = [[fwht_natural([oracleToUse[x+1] for x in y]) for y in s] for s in samples]
found = Dict()
rmappings = []
for x in mappings
    if length(x) == 0
        push!(rmappings,x)
    else
        ralt = Dict()
        for i in keys(x)
            ralt[x[i]]= i
        end
        push!(rmappings,ralt)
    end
end    
prevFound = 0


for i in 1:maxPass



    for co = 1:length(listOfX)
        bucketSize = length(listOfX[co][1])
        for extractValue = 1:bucketSize
            extracted = [x[extractValue] for x in listOfX[co]]
            if !(PEEL.closeToZero(extracted,qubitSize*2,cutoff= zerosC))
               (isit,bits,val) = PEEL.checkAndExtractSingleton([extracted],qubitSize*2,cutoff=singletons)
               if isit
                  #print("$bits\n")
                  #pval = binaryArrayToNumber(j6*[x == '0' ?  0 : 1 for x in bits])
                  vval = parse(Int,bits,base=2)
                  #print("$bits, $vval $(round(dist[vval+1],digits=5)) and $(round(val,digits=5))\n")
                  PEEL.peelBack(listOfX,listOfPs,bits,val,found,ds,rmappings)
               end
            end
        end   
    end
    if length(found) > prevFound
                prevFound = length(found)
    else
        singletons += singletonsInc
        zerosC -=zerosDec 
        if (zerosC <= 0)
                break
        end
     end
     if length(found) > 0
                print("Pass $i, $(length(found)) $(sum([mean(found[x]) for x in keys(found)]))\n")
            if sum([mean(found[x]) for x in keys(found)]) >= 0.999995
                break
            end
     end


end
Pass 1, 8 0.9988247398205602
Pass 2, 8 0.9988247398205602
Pass 3, 8 0.9988247398205602
Pass 4, 8 0.9988247398205602
Pass 5, 8 0.9988247398205602
Pass 6, 9 0.9992720461601559
Pass 7, 9 0.9992720461601559
Pass 8, 10 0.9995543017770276
Pass 9, 10 0.9995543017770276
Pass 10, 10 0.9995543017770276
Pass 11, 10 0.9995543017770276
Pass 12, 10 0.9995543017770276
Pass 13, 10 0.9995543017770276
Pass 14, 10 0.9995543017770276
Pass 15, 10 0.9995543017770276
Pass 16, 10 0.9995543017770276
Pass 17, 10 0.9995543017770276
Pass 18, 10 0.9995543017770276
Pass 19, 10 0.9995543017770276
Pass 20, 11 0.999854001381265
Pass 21, 11 0.999854001381265
Pass 22, 11 0.999854001381265
Pass 23, 11 0.999854001381265
Pass 24, 11 0.999854001381265
Pass 25, 11 0.999854001381265
Pass 26, 11 0.999854001381265
Pass 27, 11 0.999854001381265
Pass 28, 11 0.999854001381265
Pass 29, 11 0.999854001381265
Pass 30, 11 0.999854001381265
Pass 31, 12 1.000093606020884
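The decoding loop above builds `listOfX` by applying `fwht_natural` to the 64 oracle values of each sample set. For reference, the natural-ordered (Sylvester) fast Walsh-Hadamard transform can be written in a few lines; this Python version is an illustrative re-implementation, assumed equivalent to the Julia routine used here:

```python
def fwht_natural(a):
    """Unnormalised fast Walsh-Hadamard transform, natural (Sylvester) ordering."""
    a = list(a)
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            for j in range(i, i + h):
                # Butterfly: combine entries h apart.
                a[j], a[j + h] = a[j] + a[j + h], a[j] - a[j + h]
        h *= 2
    return a
```

Applying the transform twice and dividing by the length recovers the input, which is why the same routine serves for both binning and un-binning.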
In [62]:
foundV = []
print("Estimate  <===>   Actual\n")
print("------------------------\n")


for x in keys(found)
    vval = parse(Int,x,base=2)
    push!(foundV,vval+1)
    print("$(probabilityLabels(vval,qubits=6)): $vval : $(round(mean(found[x]),digits=5)) <===>  $(round(dist[vval+1],digits=5))\n")
end
Estimate  <===>   Actual
------------------------
IIXIII: 128 : 0.00035 <===>  0.00029
ZIIZZI: 3132 : 0.00024 <===>  0.0
IZIXZZ: 815 : 0.00401 <===>  0.004
ZIIZZY: 3133 : 0.00028 <===>  0.0
IIIIIX: 2 : 0.00404 <===>  0.0039
IYIIII: 256 : 0.00391 <===>  0.00394
IIIZII: 48 : 0.0003 <===>  0.00029
IXIIII: 512 : 0.01978 <===>  0.01968
IIIIYI: 4 : 0.00045 <===>  0.00029
YIIIII: 1024 : 0.00039 <===>  0.0001
IIIIII: 0 : 0.95671 <===>  0.9566
IIIIIY: 1 : 0.00963 <===>  0.00974
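The integer indices in this printout encode a 6-qubit Pauli as twelve bits, two per qubit. From the labels above the pair encoding is 00 → I, 01 → Y, 10 → X, 11 → Z (inferred from this workbook's output; the workbook itself uses the Julia helper `probabilityLabels`, and the Python function below is a hypothetical stand-in):

```python
# Pair encoding inferred from the Estimate <===> Actual table above.
PAIR = {"00": "I", "01": "Y", "10": "X", "11": "Z"}

def pauli_label(index, qubits=6):
    """Decode an integer Pauli index into its 6-character label."""
    bits = format(index, "0{}b".format(2 * qubits))
    return "".join(PAIR[bits[i:i + 2]] for i in range(0, 2 * qubits, 2))

print(pauli_label(815))   # -> IZIXZZ, the hidden high-weight Pauli
```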
In [63]:
title("Distribution of Pauli error rates")
yscale("log")
ylabel("Error Rate")
xlabel("Paulis indexed and sorted by error rate")
indices = reverse(sortperm(dist))[1:15]
scatter(1:15,[dist[x] for x in indices])
loc = [findfirst(x->x==y,indices) for y in foundV]
scatter(loc,[mean(found[string(x-1,base=2,pad=12)]) for x in foundV]) # found stores a list of estimates per Pauli, so take the mean
xticks(1:15,[probabilityLabels(x-1,qubits=6) for x in indices],rotation=90);

How will things improve if we increase the shot count?

What we will see is that the accuracy of the recovered error rates increases, but not necessarily the number of Paulis recovered.

Recovering more Paulis than a $\delta$ of 0.25 allows requires increasing the number of sub-sampling groups. This is described in the paper.
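As a rough back-of-envelope check (my own sketch, not part of the workbook's pipeline): a survival probability $p$ estimated from $N$ shots has a standard error of $\sqrt{p(1-p)/N}$, so the estimates sharpen only as $1/\sqrt{N}$.

```julia
# Sketch: standard error of a survival-probability estimate shrinks as
# 1/sqrt(shots), so more shots sharpen each fidelity estimate without,
# by itself, letting the peeling decoder recover more Paulis.
stderr_p(p, shots) = sqrt(p * (1 - p) / shots)

p = 0.97                                # a typical survival probability
se_small = stderr_p(p, 30_000)
se_large = stderr_p(p, 1_000_000)
println("30k shots:  ±", round(se_small, digits=6))
println("1M shots:   ±", round(se_large, digits=6))
println("improvement: ", round(se_small / se_large, digits=2), "x")
```

Going from 30,000 to 1,000,000 shots buys roughly a 5.8x reduction in the error bars on each fidelity, which is why the recovered rates get closer to the true values even though the number of recovered Paulis barely changes.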

In [64]:
# So we kept the actual probabilities of each of the experiments
# And now we can increase the shot statistics to see how things improve:

# 1,000 sequences of 1,000 shots
manyMoreShots = 1000000

# We need cumulative probability matrices
cumMatrix = map(cumsum,experiment1_allProbs)
experiment1_observed = shotSimulator(64,manyMoreShots,cumMatrix);
(params,l, failed) = fitTheFidelities(lengths,experiment1_observed)
experiment1_fidelities = vcat(1,[p[2] for p in params]) # We don't fit the first one, it is always 1 for CPTP maps
#Turn those into a number...
fidelitiesExtracted = []

for x0 in paulisAll[1][3]
    p56 = binaryArrayToNumber(x0)
    for x1 in paulisAll[1][2]
        p34 = binaryArrayToNumber(x1)
        for x2 in paulisAll[1][1]
            p12= binaryArrayToNumber(x2)
            push!(fidelitiesExtracted,p56*4^4+p34*16+p12+1) # +1 because Julia arrays are 1-indexed
        end
    end
end

estimateOracle = [[] for _ in 1:4096]

for i in 1:64
    push!(estimateOracle[fidelitiesExtracted[i]],experiment1_fidelities[i])
end
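For reference, shotSimulator works from the cumulative probability vectors built with cumsum above. A toy inverse-CDF sampler in the same spirit (my own sketch with made-up names, not the workbook's implementation) looks like this:

```julia
# Toy inverse-CDF sampler: given a cumulative probability vector,
# draw an outcome index by finding where a uniform random number lands.
function sampleOutcome(cumProbs::Vector{Float64})
    r = rand()
    return searchsortedfirst(cumProbs, r)   # first index with cumProbs[i] >= r
end

probs = [0.5, 0.25, 0.125, 0.125]
cum = cumsum(probs)
counts = zeros(Int, length(probs))
for _ in 1:100_000
    counts[sampleOutcome(cum)] += 1
end
println(counts ./ 100_000)   # close to [0.5, 0.25, 0.125, 0.125]
```

The workbook's shotSimulator does this over 64 outcomes for every sequence length, which is why the cumsum matrices are precomputed once per experiment.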
In [65]:
# Now I am going to do all of the above for each of the 5 experiment types (other than the one already done) on each qubit pair
e1_all_additional_fidelities = []
e1_fidelity_extracted = []
# Note we don't actually need to save the actual probabilities - but I am going to use them later
# To demonstrate some different recovery regimes.
exp_count = 1
for qubitPairOn = 1:3 # qubit pairs here are 1&2, 3&4 and 5&6
    for experimentType = 1:5
        if experiments[1][qubitPairOn][2] != experimentType
            # It's one we haven't done
            expOnFirstPair  = experiments[1][1][2]
            expOnSecondPair = experiments[1][2][2]
            expOnThirdPair  = experiments[1][3][2]

            
            if qubitPairOn == 1
                expOnFirstPair = experimentType
            elseif qubitPairOn == 2
                expOnSecondPair = experimentType
            else
                expOnThirdPair = experimentType
            end
            
            # Get the actual probabilities previously saved.
            additionalExperiment = e1_all_actualProbabilities[exp_count]
            exp_count +=1
            # Generate the measurement statistics
            cumMatrix = map(cumsum,additionalExperiment)
            experiment1_additional_observed = shotSimulator(64,manyMoreShots,cumMatrix);
            # Fit and extract the fidelities
            (params,l, failed) = fitTheFidelities(lengths,experiment1_additional_observed)
            experiment1_additional_fidelities = vcat(1,[p[2] for p in params])
            push!(e1_all_additional_fidelities,experiment1_additional_fidelities)
            fidelitiesExtracted=[]
            for x0 in all2QlMuBs[expOnThirdPair]
                p56 = binaryArrayToNumber(x0)
                for x1 in all2QlMuBs[expOnSecondPair]
                    p34 = binaryArrayToNumber(x1)
                    for x2 in all2QlMuBs[expOnFirstPair]
                        p12= binaryArrayToNumber(x2)
                        push!(fidelitiesExtracted,p56*4^4+p34*16+p12+1) # +1 because Julia arrays are 1-indexed
                    end
                end
            end
            push!(e1_fidelity_extracted,fidelitiesExtracted)
        end
    end
end
# So we just need to fill in the oracle

for (expNo,x) in enumerate(e1_fidelity_extracted)
    for (fidelityIndex,fidelity)  in enumerate(x)
        push!(estimateOracle[fidelity],e1_all_additional_fidelities[expNo][fidelityIndex])
    end
end
In [66]:
# Generate shot limited stats

cumMatrix = map(cumsum,experiment2)
experiment2_observed = shotSimulator(64,manyMoreShots,cumMatrix);
# Fit and extract the fidelities
(params,l, failed) = fitTheFidelities(lengths,experiment2_observed)
experiment2_fidelities = vcat(1,[p[2] for p in params])
fidelitiesExtracted = []
for x1b in potentialSingles[experiments[2][4][2]] # Experiment 2, qubit division (q1, q2&3, q4&5, q6); the second element = experiment number
    p6 = binaryArrayToNumber(x1b)
    for x2b in all2QlMuBs[experiments[2][3][2]]
        p45 = binaryArrayToNumber(x2b)
        for x2a in all2QlMuBs[experiments[2][2][2]]
            p23 = binaryArrayToNumber(x2a)
            for x1a in potentialSingles[experiments[2][1][2]]
                p1 = binaryArrayToNumber(x1a)
                push!(fidelitiesExtracted,p6*4^5+p45*4^3+p23*4^1+p1+1) # +1 because Julia arrays are 1-indexed
             end
        end
    end
end


for (fidelityIndex,fidelity)  in enumerate(fidelitiesExtracted)
    push!(estimateOracle[fidelity],experiment2_fidelities[fidelityIndex])
end
In [67]:
e2_all_additional_fidelities = []
e2_fidelity_extracted = []
# Note here I am hard coding that we have single qubit twirls on 1 and 4.
e_count = 0
exp_count = 1
for qubitOn in 1:4 # the qubit sets that can be "on" here are q1, q2&3, q4&5 and q6
    
    if qubitOn == 2 || qubitOn == 3
        noOfExperiments = 5
    else 
        noOfExperiments = 3 # Only 3 if it's a single qubit.
    end
    for experimentType = 1:noOfExperiments
        if experiments[2][qubitOn][2] != experimentType
            # It's one we haven't done
            expOnFirstSet = experiments[2][1][2]
            expOnSecondSet = experiments[2][2][2]
            expOnThirdSet = experiments[2][3][2]
            expOnFourthSet = experiments[2][4][2]


            if qubitOn == 1
                expOnFirstSet = experimentType
            elseif qubitOn == 2
                expOnSecondSet = experimentType
            elseif qubitOn ==3 
                expOnThirdSet = experimentType
            else 
                expOnFourthSet = experimentType
            end
                
            additionalExperiment = e2_all_actualProbabilities[exp_count]
            exp_count+=1
            
            # Generate the measurement statistics
            cumMatrix = map(cumsum,additionalExperiment)
            experiment2_additional_observed =  shotSimulator(64,manyMoreShots,cumMatrix);
            # Fit and extract the fidelities
            (params,l, failed) = fitTheFidelities(lengths,experiment2_additional_observed)
            experiment2_additional_fidelities = vcat(1,[p[2] for p in params])
            push!(e2_all_additional_fidelities,experiment2_additional_fidelities)
            fidelitiesExtracted=[]
            for x1 in potentialSingles[expOnFourthSet] # Experiment 2, qubit division (q1, q2&3, q4&5, q6); the second element = experiment number
                p6 = binaryArrayToNumber(x1)
                for x2b in all2QlMuBs[expOnThirdSet]
                    p45 = binaryArrayToNumber(x2b)
                    for x2 in all2QlMuBs[expOnSecondSet]
                        p23 = binaryArrayToNumber(x2)
                        for x3 in potentialSingles[expOnFirstSet]
                            p1 = binaryArrayToNumber(x3)
                            push!(fidelitiesExtracted,p6*4^5+p45*4^3+p23*4+p1+1) # +1 because Julia arrays are 1-indexed
                        end
                     end
                end
            end
            push!(e2_fidelity_extracted,fidelitiesExtracted)
        end
    end
end

# So we just need to fill in the oracle

for (expNo,x) in enumerate(e2_fidelity_extracted)
    for (fidelityIndex,fidelity)  in enumerate(x)
        push!(estimateOracle[fidelity],e2_all_additional_fidelities[expNo][fidelityIndex])
    end
end
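The index arithmetic in the extraction loops above (p6*4^5+p45*4^3+p23*4+p1+1 and friends) is just base-4 digit packing: each qubit contributes one base-4 digit of the global Pauli index, so a two-qubit subset contributes a factor of 4^2 = 16. A round-trip sketch with hypothetical helper names:

```julia
# Sketch: packing per-subset Pauli indices into one global index.
# Each qubit contributes one base-4 digit; a two-qubit subset
# therefore contributes a factor of 4^2 = 16.
packIndex(p56, p34, p12) = p56 * 4^4 + p34 * 4^2 + p12 + 1  # +1 for Julia's 1-based arrays

# Unpack to check the round trip (hypothetical helper).
function unpackIndex(idx)
    v = idx - 1
    return (div(v, 4^4), div(mod(v, 4^4), 4^2), mod(v, 4^2))
end

idx = packIndex(3, 7, 12)
println(idx, " -> ", unpackIndex(idx))  # 893 -> (3, 7, 12)
```

The same pattern, with different digit widths, covers the mixed single-qubit/two-qubit divisions used in experiment 2.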
In [68]:
millionShotOracle = [length(x) > 0 ? mean(x) : 0 for x in estimateOracle]
Out[68]:
4096-element Array{Real,1}:
 1.0
 0.9719885180800645
 0.983979552464777
 0.9719979638905242
 0.991397670766155
 0.9794047891968238
 0.9913805110959736
 0.963418819353371
 0.9917692042639024
 0.9797838796499244
 0.9918044072561205
 0.9637864067913748
 0.9991981128146965
 ⋮
 0
 0
 0
 0
 0.9497286952423499
 0
 0
 0.9231080136842243
 0
 0
 0
 0
In [69]:
using LinearAlgebra
maxPass = 200
# 'singletons' is the noise threshold below which we declare we have found a singleton.
# It will be related to the measurement accuracy and the number of bins.
# Here we base it off the shotsToDo variance, reflecting our hoped-for recovery.

# We start it low and then slowly increase it, making us more likely to accept a singleton.
# If you have a certain probability distribution and this ansatz is not working, set it
# so that you get a reasonable number of hits in the first round.
singletons = (0.001*.999)/30000
singletonsInc = singletons/2

# zerosC is set high - we don't want to accept bins with very low weight, as they are probably just noise.
# If the sum of (mean - value)^2 over all the offsets is below this number we ignore the bin.
# But then we lower it, making us less likely to decide a bin has no value in it.
# Obviously it should never become negative.
zerosC = (0.01*.999)/20000*2*1.1
zerosDec = (zerosC*0.99)/maxPass



prevFound = 0
qubitSize = 6
j6=diagm(1 => vec(hcat([[1 0] for i = 1:6]...)[1:end-1]),-1 => vec(hcat([[1 0] for i = 1:6]...)[1:end-1]))

listOfX = [[fwht_natural([millionShotOracle[x+1] for x in y]) for y in s] for s in samples]
found = Dict()
rmappings = []
for x in mappings
    if length(x) == 0
        push!(rmappings,x)
    else
        ralt = Dict()
        for i in keys(x)
            ralt[x[i]]= i
        end
        push!(rmappings,ralt)
    end
end    
prevFound = 0


for i in 1:maxPass



    for co = 1:length(listOfX)
        bucketSize = length(listOfX[co][1])
        for extractValue = 1:bucketSize
            extracted = [x[extractValue] for x in listOfX[co]]
            if !(PEEL.closeToZero(extracted,qubitSize*2,cutoff= zerosC))
               (isit,bits,val) = PEEL.checkAndExtractSingleton([extracted],qubitSize*2,cutoff=singletons)
               if isit
                  #print("$bits\n")
                  #pval = binaryArrayToNumber(j6*[x == '0' ?  0 : 1 for x in bits])
                  vval = parse(Int,bits,base=2)
                  #print("$bits, $vval $(round(dist[vval+1],digits=5)) and $(round(val,digits=5))\n")
                  PEEL.peelBack(listOfX,listOfPs,bits,val,found,ds,rmappings)
               end
            end
        end   
    end
    if length(found) > prevFound
        prevFound = length(found)
    else
        singletons += singletonsInc
        zerosC -= zerosDec
        if zerosC <= 0
            break
        end
    end
    if length(found) > 0
        print("Pass $i, $(length(found)) $(sum([mean(found[x]) for x in keys(found)]))\n")
        if sum([mean(found[x]) for x in keys(found)]) >= 0.999995
            break
        end
    end


end
print("Terminated finding $(length(found)) Paulis, total of $(sum([mean(found[x]) for x in keys(found)]) ) worth of probability!\n")
Pass 1, 6 0.9979370870188927
Pass 2, 6 0.9979370870188927
Pass 3, 6 0.9979370870188927
Pass 4, 6 0.9979370870188927
Pass 5, 6 0.9979370870188927
Pass 6, 6 0.9979370870188927
Pass 7, 6 0.9979370870188927
Pass 8, 6 0.9979370870188927
Pass 9, 6 0.9979370870188927
Pass 10, 6 0.9979370870188927
Pass 11, 6 0.9979370870188927
Pass 12, 6 0.9979370870188927
Pass 13, 6 0.9979370870188927
Pass 14, 6 0.9979370870188927
Pass 15, 6 0.9979370870188927
Pass 16, 6 0.9979370870188927
Pass 17, 6 0.9979370870188927
Pass 18, 6 0.9979370870188927
Pass 19, 6 0.9979370870188927
Pass 20, 6 0.9979370870188927
Pass 21, 6 0.9979370870188927
Pass 22, 6 0.9979370870188927
Pass 23, 6 0.9979370870188927
Pass 24, 6 0.9979370870188927
Pass 25, 6 0.9979370870188927
Pass 26, 6 0.9979370870188927
Pass 27, 6 0.9979370870188927
Pass 28, 6 0.9979370870188927
Pass 29, 6 0.9979370870188927
Pass 30, 6 0.9979370870188927
Pass 31, 6 0.9979370870188927
Pass 32, 6 0.9979370870188927
Pass 33, 6 0.9979370870188927
Pass 34, 6 0.9979370870188927
Pass 35, 6 0.9979370870188927
Pass 36, 6 0.9979370870188927
Pass 37, 6 0.9979370870188927
Pass 38, 6 0.9979370870188927
Pass 39, 6 0.9979370870188927
Pass 40, 6 0.9979370870188927
Pass 41, 6 0.9979370870188927
Pass 42, 6 0.9979370870188927
Pass 43, 6 0.9979370870188927
Pass 44, 6 0.9979370870188927
Pass 45, 6 0.9979370870188927
Pass 46, 6 0.9979370870188927
Pass 47, 6 0.9979370870188927
Pass 48, 6 0.9979370870188927
Pass 49, 6 0.9979370870188927
Pass 50, 6 0.9979370870188927
Pass 51, 6 0.9979370870188927
Pass 52, 6 0.9979370870188927
Pass 53, 6 0.9979370870188927
Pass 54, 6 0.9979370870188927
Pass 55, 6 0.9979370870188927
Pass 56, 6 0.9979370870188927
Pass 57, 6 0.9979370870188927
Pass 58, 6 0.9979370870188927
Pass 59, 6 0.9979370870188927
Pass 60, 6 0.9979370870188927
Pass 61, 6 0.9979370870188927
Pass 62, 6 0.9979370870188927
Pass 63, 6 0.9979370870188927
Pass 64, 6 0.9979370870188927
Pass 65, 6 0.9979370870188927
Pass 66, 6 0.9979370870188927
Pass 67, 6 0.9979370870188927
Pass 68, 6 0.9979370870188927
Pass 69, 6 0.9979370870188927
Pass 70, 6 0.9979370870188927
Pass 71, 6 0.9979370870188927
Pass 72, 6 0.9979370870188927
Pass 73, 6 0.9979370870188927
Pass 74, 6 0.9979370870188927
Pass 75, 6 0.9979370870188927
Pass 76, 6 0.9979370870188927
Pass 77, 6 0.9979370870188927
Pass 78, 6 0.9979370870188927
Pass 79, 6 0.9979370870188927
Pass 80, 6 0.9979370870188927
Pass 81, 6 0.9979370870188927
Pass 82, 6 0.9979370870188927
Pass 83, 6 0.9979370870188927
Pass 84, 6 0.9979370870188927
Pass 85, 6 0.9979370870188927
Pass 86, 6 0.9979370870188927
Pass 87, 6 0.9979370870188927
Pass 88, 6 0.9979370870188927
Pass 89, 6 0.9979370870188927
Pass 90, 6 0.9979370870188927
Pass 91, 6 0.9979370870188927
Pass 92, 6 0.9979370870188927
Pass 93, 6 0.9979370870188927
Pass 94, 6 0.9979370870188927
Pass 95, 6 0.9979370870188927
Pass 96, 6 0.9979370870188927
Pass 97, 6 0.9979370870188927
Pass 98, 6 0.9979370870188927
Pass 99, 6 0.9979370870188927
Pass 100, 6 0.9979370870188927
Pass 101, 6 0.9979370870188927
Pass 102, 6 0.9979370870188927
Pass 103, 6 0.9979370870188927
Pass 104, 6 0.9979370870188927
Pass 105, 6 0.9979370870188927
Pass 106, 6 0.9979370870188927
Pass 107, 6 0.9979370870188927
Pass 108, 6 0.9979370870188927
Pass 109, 6 0.9979370870188927
Pass 110, 6 0.9979370870188927
Pass 111, 6 0.9979370870188927
Pass 112, 6 0.9979370870188927
Pass 113, 6 0.9979370870188927
Pass 114, 6 0.9979370870188927
Pass 115, 6 0.9979370870188927
Pass 116, 6 0.9979370870188927
Pass 117, 6 0.9979370870188927
Pass 118, 6 0.9979370870188927
Pass 119, 6 0.9979370870188927
Pass 120, 6 0.9979370870188927
Pass 121, 6 0.9979370870188927
Pass 122, 6 0.9979370870188927
Pass 123, 6 0.9979370870188927
Pass 124, 6 0.9979370870188927
Pass 125, 6 0.9979370870188927
Pass 126, 6 0.9979370870188927
Pass 127, 6 0.9979370870188927
Pass 128, 6 0.9979370870188927
Pass 129, 6 0.9979370870188927
Pass 130, 6 0.9979370870188927
Pass 131, 6 0.9979370870188927
Pass 132, 6 0.9979370870188927
Pass 133, 6 0.9979370870188927
Pass 134, 6 0.9979370870188927
Pass 135, 6 0.9979370870188927
Pass 136, 6 0.9979370870188927
Pass 137, 6 0.9979370870188927
Pass 138, 6 0.9979370870188927
Pass 139, 6 0.9979370870188927
Pass 140, 6 0.9979370870188927
Pass 141, 6 0.9979370870188927
Pass 142, 6 0.9979370870188927
Pass 143, 6 0.9979370870188927
Pass 144, 6 0.9979370870188927
Pass 145, 6 0.9979370870188927
Pass 146, 6 0.9979370870188927
Pass 147, 6 0.9979370870188927
Pass 148, 6 0.9979370870188927
Pass 149, 6 0.9979370870188927
Pass 150, 6 0.9979370870188927
Pass 151, 6 0.9979370870188927
Pass 152, 6 0.9979370870188927
Pass 153, 6 0.9979370870188927
Pass 154, 6 0.9979370870188927
Pass 155, 6 0.9979370870188927
Pass 156, 7 0.9983899128189936
Pass 157, 7 0.9983899128189936
Pass 158, 7 0.9983899128189936
Pass 159, 7 0.9983899128189936
Pass 160, 7 0.9983899128189936
Pass 161, 7 0.9983899128189936
Pass 162, 7 0.9983899128189936
Pass 163, 7 0.9983899128189936
Pass 164, 7 0.9983899128189936
Pass 165, 7 0.9983899128189936
Pass 166, 7 0.9983899128189936
Pass 167, 7 0.9983899128189936
Pass 168, 7 0.9983899128189936
Pass 169, 7 0.9983899128189936
Pass 170, 7 0.9983899128189936
Pass 171, 7 0.9983899128189936
Pass 172, 7 0.9983899128189936
Pass 173, 7 0.9983899128189936
Pass 174, 7 0.9983899128189936
Pass 175, 7 0.9983899128189936
Pass 176, 7 0.9983899128189936
Pass 177, 7 0.9983899128189936
Pass 178, 7 0.9983899128189936
Pass 179, 7 0.9983899128189936
Pass 180, 7 0.9983899128189936
Pass 181, 7 0.9983899128189936
Pass 182, 8 0.998744776853889
Pass 183, 8 0.998744776853889
Pass 184, 8 0.998744776853889
Pass 185, 8 0.998744776853889
Pass 186, 8 0.998744776853889
Pass 187, 8 0.998744776853889
Pass 188, 8 0.998744776853889
Pass 189, 8 0.998744776853889
Pass 190, 8 0.998744776853889
Pass 191, 9 0.9990320240533556
Pass 192, 9 0.9990320240533556
Pass 193, 10 0.9993160693653758
Pass 194, 10 0.9993160693653758
Pass 195, 11 0.9995875209269763
Pass 196, 11 0.9995875209269763
Pass 197, 12 0.9998371797056896
Pass 198, 12 0.9998371797056896
Pass 199, 12 0.9998371797056896
Pass 200, 12 0.9998371797056896
Terminated finding 12 Paulis, total of 0.9998371797056896 worth of probability!
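The fwht_natural call in the cell above is a natural-ordering fast Walsh-Hadamard transform; it is what turns each subsampled group of eigenvalues into (aliased) sums of Pauli error rates for the peeling decoder. A minimal in-place sketch (my own implementation, assuming the input length is a power of two):

```julia
# Sketch of a natural-order fast Walsh-Hadamard transform (unnormalised).
# Butterfly structure: combine pairs at stride len, doubling len each stage.
function fwht!(a::Vector{Float64})
    n = length(a)
    len = 1
    while len < n
        for i in 1:2len:n
            for j in i:(i+len-1)
                x, y = a[j], a[j+len]
                a[j], a[j+len] = x + y, x - y
            end
        end
        len *= 2
    end
    return a
end

println(fwht!([1.0, 0.0, 1.0, 0.0]))   # [2.0, 2.0, 0.0, 0.0]
```

The workbook's fwht_natural may differ in normalisation and ordering conventions; what matters for the decoder is that each output is a signed sum over a coset of Paulis, which the peeling passes then disentangle.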
In [70]:
foundV = []
for x in keys(found)
    vval = parse(Int,x,base=2)
    push!(foundV,vval+1)
    print("$(round(mean(found[x]),digits=6)) <===> $(round(dist[vval+1],digits=6))\n")
end
title("Distribution of Pauli error rates")
yscale("log")
ylabel("Error Rate")
xlabel("Paulis indexed and sorted by error rate")
indices = reverse(sortperm(dist))[2:15] # skip the identity, which dominates the plot
scatter(1:14,[dist[x] for x in indices])
loc = [findfirst(x->x==y,indices) for y in foundV]
scatter(loc,[mean(found[string(x-1,base=2,pad=12)]) for x in foundV])
xticks(1:14,[probabilityLabels(x-1,qubits=6) for x in indices],rotation=90);
0.00025 <===> 0.000192
0.000284 <===> 0.000288
0.003997 <===> 0.004
0.003898 <===> 0.003897
0.003929 <===> 0.003937
0.000355 <===> 0.000288
0.000453 <===> 0.000288
0.019743 <===> 0.019685
0.000287 <===> 0.000288
0.956614 <===> 0.956605
0.000271 <===> 0.0002
0.009757 <===> 0.009742
In [ ]: